datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
joey234/mmlu-high_school_us_history-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 28257
num_examples: 5
- name: test
num_bytes: 1258154
num_examples: 204
download_size: 55081
dataset_size: 1286411
---
# Dataset Card for "mmlu-high_school_us_history-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/paradetox-preprocess-editOps | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en_toxic_comment
dtype: string
- name: en_neutral_comment
dtype: string
- name: edit_ops
sequence:
sequence: string
splits:
- name: train
num_bytes: 4628797
num_examples: 19744
download_size: 1848112
dataset_size: 4628797
---
# Dataset Card for "paradetox-preprocess-editOps"
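The `edit_ops` feature above is a sequence of string sequences describing edits between the toxic and the neutral comment. The dataset's exact op format is not documented on this card, but as an illustration (purely hypothetical format, assuming word-level ops), similar ops can be derived with Python's `difflib`:

```python
import difflib

def edit_ops(source: str, target: str) -> list[list[str]]:
    """Word-level opcodes turning `source` into `target` (illustrative format)."""
    src, tgt = source.split(), target.split()
    ops = []
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, src, tgt).get_opcodes():
        if tag != "equal":
            # Each op records the kind of edit plus the affected word spans
            ops.append([tag, " ".join(src[i1:i2]), " ".join(tgt[j1:j2])])
    return ops

print(edit_ops("this is damn awful", "this is not good"))
```

The example sentences here are invented; consult the actual rows of the dataset for the real op encoding.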
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-1b5bc0-46134145181 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
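The `col_mapping` in the metadata maps the evaluator's expected field names to (possibly nested) dataset columns, with nesting written in dot notation (`answers.text`). A minimal sketch of how such a mapping could be applied to one record; the sample record below is illustrative, not taken from the dataset, and this is not AutoTrain's actual implementation:

```python
def apply_col_mapping(example: dict, mapping: dict) -> dict:
    """Resolve each dot-separated source path into the evaluator's flat field name."""
    out = {}
    for dst, src in mapping.items():
        value = example
        for part in src.split("."):
            value = value[part]  # walk one level of nesting per path segment
        out[dst] = value
    return out

# Mirrors the card's col_mapping
mapping = {
    "context": "context",
    "question": "question",
    "answers-text": "answers.text",
    "answers-answer_start": "answers.answer_start",
}
sample = {
    "context": "The sky is blue.",
    "question": "What color is the sky?",
    "answers": {"text": ["blue"], "answer_start": [11]},
}
print(apply_col_mapping(sample, mapping))
```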
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nomic-ai](https://huggingface.co/nomic-ai) for evaluating this model. |
huggingartists/phish | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/phish"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.372501 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:block; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/df85b83684e95f87794aa09580ee0463.919x919x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/phish">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Phish</div>
<a href="https://genius.com/artists/phish">
<div style="text-align: center; font-size: 14px;">@phish</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius. It is designed for generating lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/phish).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/phish")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   394 |          - |    - |
The 'train' split can be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/phish")

# Split ratios; the test share is the remainder after train and validation
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)}),
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
FINNUMBER/FINCH_TRAIN_NQA_ARI_100_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 279758
num_examples: 100
download_size: 175553
dataset_size: 279758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YAGO1818/LunaVB | ---
license: apache-2.0
---
|
carlavic/Moedas | ---
license: openrail
---
|
YiYiXu/yiyi_test_ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: condtioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 22659003.0
num_examples: 15
download_size: 22663578
dataset_size: 22659003.0
---
# Dataset Card for "yiyi_test_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/partitioned_v2_standardized_07 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 33994535.7580061
num_examples: 66505
download_size: 36546793
dataset_size: 33994535.7580061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_07"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/seireinomoribito | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Seirei No Moribito
This is the image base of the bangumi Seirei no Moribito. We detected 26 characters and 2,981 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend preprocessing the downloaded data to eliminate potential noisy samples (approximately 1% probability).
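As a starting point for the recommended preprocessing, here is a minimal sketch (assuming a character archive such as `0/dataset.zip` has been downloaded locally; the extraction directory name is arbitrary) that unpacks an archive and lists its image files for manual review:

```python
import zipfile
from pathlib import Path

def list_images(archive_path: str, extract_to: str = "review") -> list[str]:
    """Extract a character archive and return its image files for manual review."""
    out_dir = Path(extract_to)
    out_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(out_dir)
    # Collect common image extensions; noisy samples (~1%) must be removed by hand
    exts = {".png", ".jpg", ".jpeg", ".webp"}
    return sorted(str(p) for p in out_dir.rglob("*") if p.suffix.lower() in exts)
```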
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 593 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 73 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 487 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 450 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 327 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 79 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 33 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 126 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 81 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 46 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 43 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 73 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 16 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 172 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 96 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 18 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 62 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 18 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 53 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 24 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 28 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 11 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 12 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 12 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 34 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
piercemaloney/coqgym_coq_projects_v1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: finmap
num_bytes: 745110
num_examples: 3
- name: GeometricAlgebra
num_bytes: 2180457
num_examples: 8
- name: bdds
num_bytes: 11537326
num_examples: 15
- name: topology
num_bytes: 1998914
num_examples: 32
- name: euler_formula
num_bytes: 2157257
num_examples: 3
- name: ruler_compass_geometry
num_bytes: 105862
num_examples: 10
- name: twoSquare
num_bytes: 219009
num_examples: 2
- name: zfc
num_bytes: 605621
num_examples: 11
- name: shuffle
num_bytes: 87431
num_examples: 6
- name: three_gap
num_bytes: 315458
num_examples: 7
- name: regexp
num_bytes: 116907
num_examples: 7
- name: automata
num_bytes: 1075636
num_examples: 25
- name: izf
num_bytes: 146423
num_examples: 8
- name: railroad_crossing
num_bytes: 202543
num_examples: 1
- name: idxassoc
num_bytes: 139339
num_examples: 2
- name: hoare_tut
num_bytes: 39069
num_examples: 3
- name: lesniewski_mereology
num_bytes: 123636
num_examples: 2
- name: additions
num_bytes: 147048
num_examples: 20
- name: checker
num_bytes: 6689
num_examples: 2
- name: domain_theory
num_bytes: 67545
num_examples: 4
- name: propcalc
num_bytes: 54422
num_examples: 5
- name: area_method
num_bytes: 1578799
num_examples: 38
- name: ails
num_bytes: 521926
num_examples: 11
- name: dep_map
num_bytes: 24951
num_examples: 2
- name: markov
num_bytes: 102887
num_examples: 1
- name: rsa
num_bytes: 180155
num_examples: 5
- name: goedel
num_bytes: 9110441
num_examples: 44
- name: bigenough
num_bytes: 5031
num_examples: 1
- name: generic_environments
num_bytes: 142529
num_examples: 2
- name: ctltctl
num_bytes: 61038
num_examples: 3
- name: schroeder
num_bytes: 24958
num_examples: 4
- name: weak_up_to
num_bytes: 175365
num_examples: 10
- name: groups
num_bytes: 11115
num_examples: 1
- name: pocklington
num_bytes: 617483
num_examples: 13
- name: mini_compiler
num_bytes: 14459
num_examples: 1
- name: StructTact
num_bytes: 293887
num_examples: 17
- name: exceptions
num_bytes: 8962
num_examples: 1
- name: coqrel
num_bytes: 196290
num_examples: 12
- name: higman_s
num_bytes: 198793
num_examples: 5
- name: rem
num_bytes: 14504
num_examples: 1
- name: tree_automata
num_bytes: 1972575
num_examples: 17
- name: higman_cf
num_bytes: 49720
num_examples: 2
- name: coqoban
num_bytes: 38285
num_examples: 1
- name: search_trees
num_bytes: 59332
num_examples: 5
- name: ieee754
num_bytes: 35416
num_examples: 3
- name: jordan_curve_theorem
num_bytes: 12530912
num_examples: 10
- name: hedges
num_bytes: 360884
num_examples: 1
- name: zorns_lemma
num_bytes: 608752
num_examples: 19
- name: tortoise_hare_algorithm
num_bytes: 10084
num_examples: 1
- name: mod_red
num_bytes: 636981
num_examples: 5
- name: traversable_fincontainer
num_bytes: 429019
num_examples: 1
- name: buchberger
num_bytes: 2422607
num_examples: 29
- name: constructive_geometry
num_bytes: 80179
num_examples: 7
- name: tarski_geometry
num_bytes: 112419
num_examples: 8
- name: int_map
num_bytes: 817835
num_examples: 13
- name: float
num_bytes: 2994074
num_examples: 31
- name: InfSeqExt
num_bytes: 106656
num_examples: 5
- name: zchinese
num_bytes: 62626
num_examples: 6
- name: smc
num_bytes: 6045540
num_examples: 15
- name: pts
num_bytes: 72482
num_examples: 8
- name: param_pi
num_bytes: 2596347
num_examples: 11
- name: axiomatic_abp
num_bytes: 1204713
num_examples: 7
- name: lambda
num_bytes: 181085
num_examples: 10
- name: maths
num_bytes: 37685
num_examples: 3
- name: quicksort_complexity
num_bytes: 489164
num_examples: 28
- name: fssec_model
num_bytes: 1569135
num_examples: 25
- name: ipc
num_bytes: 3108901
num_examples: 31
- name: chinese
num_bytes: 208365
num_examples: 13
- name: cours_de_coq
num_bytes: 71295
num_examples: 11
- name: graphs
num_bytes: 644609
num_examples: 2
- name: dictionaries
num_bytes: 67746
num_examples: 1
- name: free_groups
num_bytes: 63973
num_examples: 1
- name: ramsey
num_bytes: 11734
num_examples: 1
- name: angles
num_bytes: 322579
num_examples: 5
- name: orb_stab
num_bytes: 204783
num_examples: 1
- name: qarith_stern_brocot
num_bytes: 10873352
num_examples: 35
- name: group_theory
num_bytes: 65160
num_examples: 5
- name: demos
num_bytes: 62208
num_examples: 5
- name: distributed_reference_counting
num_bytes: 20874
num_examples: 1
- name: subst
num_bytes: 362195
num_examples: 17
- name: miniml
num_bytes: 114099
num_examples: 1
- name: algebra
num_bytes: 3275753
num_examples: 65
- name: fermat4
num_bytes: 172156
num_examples: 5
- name: otway_rees
num_bytes: 226052
num_examples: 19
- name: PolTac
num_bytes: 157370
num_examples: 13
- name: fundamental_arithmetics
num_bytes: 308733
num_examples: 8
download_size: 11353498
dataset_size: 91221719
configs:
- config_name: default
data_files:
- split: finmap
path: data/finmap-*
- split: GeometricAlgebra
path: data/GeometricAlgebra-*
- split: bdds
path: data/bdds-*
- split: topology
path: data/topology-*
- split: euler_formula
path: data/euler_formula-*
- split: ruler_compass_geometry
path: data/ruler_compass_geometry-*
- split: twoSquare
path: data/twoSquare-*
- split: zfc
path: data/zfc-*
- split: shuffle
path: data/shuffle-*
- split: three_gap
path: data/three_gap-*
- split: regexp
path: data/regexp-*
- split: automata
path: data/automata-*
- split: izf
path: data/izf-*
- split: railroad_crossing
path: data/railroad_crossing-*
- split: idxassoc
path: data/idxassoc-*
- split: hoare_tut
path: data/hoare_tut-*
- split: lesniewski_mereology
path: data/lesniewski_mereology-*
- split: additions
path: data/additions-*
- split: checker
path: data/checker-*
- split: domain_theory
path: data/domain_theory-*
- split: propcalc
path: data/propcalc-*
- split: area_method
path: data/area_method-*
- split: ails
path: data/ails-*
- split: dep_map
path: data/dep_map-*
- split: markov
path: data/markov-*
- split: rsa
path: data/rsa-*
- split: goedel
path: data/goedel-*
- split: bigenough
path: data/bigenough-*
- split: generic_environments
path: data/generic_environments-*
- split: ctltctl
path: data/ctltctl-*
- split: schroeder
path: data/schroeder-*
- split: weak_up_to
path: data/weak_up_to-*
- split: groups
path: data/groups-*
- split: pocklington
path: data/pocklington-*
- split: mini_compiler
path: data/mini_compiler-*
- split: StructTact
path: data/StructTact-*
- split: exceptions
path: data/exceptions-*
- split: coqrel
path: data/coqrel-*
- split: higman_s
path: data/higman_s-*
- split: rem
path: data/rem-*
- split: tree_automata
path: data/tree_automata-*
- split: higman_cf
path: data/higman_cf-*
- split: coqoban
path: data/coqoban-*
- split: search_trees
path: data/search_trees-*
- split: ieee754
path: data/ieee754-*
- split: jordan_curve_theorem
path: data/jordan_curve_theorem-*
- split: hedges
path: data/hedges-*
- split: zorns_lemma
path: data/zorns_lemma-*
- split: tortoise_hare_algorithm
path: data/tortoise_hare_algorithm-*
- split: mod_red
path: data/mod_red-*
- split: traversable_fincontainer
path: data/traversable_fincontainer-*
- split: buchberger
path: data/buchberger-*
- split: constructive_geometry
path: data/constructive_geometry-*
- split: tarski_geometry
path: data/tarski_geometry-*
- split: int_map
path: data/int_map-*
- split: float
path: data/float-*
- split: InfSeqExt
path: data/InfSeqExt-*
- split: zchinese
path: data/zchinese-*
- split: smc
path: data/smc-*
- split: pts
path: data/pts-*
- split: param_pi
path: data/param_pi-*
- split: axiomatic_abp
path: data/axiomatic_abp-*
- split: lambda
path: data/lambda-*
- split: maths
path: data/maths-*
- split: quicksort_complexity
path: data/quicksort_complexity-*
- split: fssec_model
path: data/fssec_model-*
- split: ipc
path: data/ipc-*
- split: chinese
path: data/chinese-*
- split: cours_de_coq
path: data/cours_de_coq-*
- split: graphs
path: data/graphs-*
- split: dictionaries
path: data/dictionaries-*
- split: free_groups
path: data/free_groups-*
- split: ramsey
path: data/ramsey-*
- split: angles
path: data/angles-*
- split: orb_stab
path: data/orb_stab-*
- split: qarith_stern_brocot
path: data/qarith_stern_brocot-*
- split: group_theory
path: data/group_theory-*
- split: demos
path: data/demos-*
- split: distributed_reference_counting
path: data/distributed_reference_counting-*
- split: subst
path: data/subst-*
- split: miniml
path: data/miniml-*
- split: algebra
path: data/algebra-*
- split: fermat4
path: data/fermat4-*
- split: otway_rees
path: data/otway_rees-*
- split: PolTac
path: data/PolTac-*
- split: fundamental_arithmetics
path: data/fundamental_arithmetics-*
---
|
YidaM4396/Test | ---
license: mit
---
|
Mizukico/sd-config | ---
license: openrail
---
|
filwsyl/ascend | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
- zh
language_bcp47:
- en
- zh-CN
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
pretty_name: 'ASCEND: A Spontaneous Chinese-English Dataset for Code-switching in
Multi-turn Conversation'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids:
- code-switching
- speech-recognition
---
# Dataset Card for ASCEND
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** [Needs More Information]
- **Paper:** https://arxiv.org/abs/2112.06223
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
ASCEND (A Spontaneous Chinese-English Dataset) is a high-quality corpus of spontaneous, multi-turn conversational Chinese-English code-switched dialogue collected in Hong Kong. ASCEND consists of 10.62 hours of spontaneous speech with a total of ~12.3K utterances. The corpus is split into three sets: training, validation, and test with a ratio of 8:1:1, while maintaining a balanced gender proportion in each set.
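The 8:1:1 ratio implies roughly the following split sizes. This is only a back-of-the-envelope sketch based on the approximate ~12.3K total quoted above; the true counts come from the published corpus itself:

```python
def split_sizes(total: int, ratios: tuple[int, ...] = (8, 1, 1)) -> list[int]:
    """Allocate `total` utterances across splits proportionally to `ratios`,
    giving any rounding remainder to the first (training) split."""
    denom = sum(ratios)
    sizes = [total * r // denom for r in ratios]
    sizes[0] += total - sum(sizes)  # assign remainder to train
    return sizes

# With ~12,300 utterances and an 8:1:1 ratio:
print(split_sizes(12300))  # → [9840, 1230, 1230]
```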
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
Chinese and English
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
sunhaozhepy/ag_news_roberta_keywords_embeddings | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 401783299
num_examples: 120000
- name: test
num_bytes: 25437883
num_examples: 7600
download_size: 493380668
dataset_size: 427221182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
suchirsalhan/MAO-CHILDES | ---
license: mit
---
|
open-llm-leaderboard/details_mosaicml__mpt-7b-instruct | ---
pretty_name: Evaluation run of mosaicml/mpt-7b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T07:03:23.990596](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-instruct/blob/main/results_2023-09-23T07-03-23.990596.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2429739932885906,\n\
\ \"em_stderr\": 0.004392127579519805,\n \"f1\": 0.2939712667785233,\n\
\ \"f1_stderr\": 0.004382684089142145,\n \"acc\": 0.3664330383509068,\n\
\ \"acc_stderr\": 0.00868382013779556\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2429739932885906,\n \"em_stderr\": 0.004392127579519805,\n\
\ \"f1\": 0.2939712667785233,\n \"f1_stderr\": 0.004382684089142145\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \
\ \"acc_stderr\": 0.0045482295338363475\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754772\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T07_03_23.990596
path:
- '**/details_harness|drop|3_2023-09-23T07-03-23.990596.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T07-03-23.990596.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T07_03_23.990596
path:
- '**/details_harness|gsm8k|5_2023-09-23T07-03-23.990596.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T07-03-23.990596.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:01:10.556120.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:01:10.556120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T07_03_23.990596
path:
- '**/details_harness|winogrande|5_2023-09-23T07-03-23.990596.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T07-03-23.990596.parquet'
- config_name: results
data_files:
- split: 2023_07_20T10_01_10.556120
path:
- results_2023-07-20T10:01:10.556120.parquet
- split: 2023_09_23T07_03_23.990596
path:
- results_2023-09-23T07-03-23.990596.parquet
- split: latest
path:
- results_2023-09-23T07-03-23.990596.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-instruct",
"harness_winogrande_5",
split="train")
```
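Each per-task configuration also exposes timestamped splits alongside `latest`. As a rough sketch (the helper name below is ours, not part of the `datasets` API), the split name for a given run can be derived from its timestamp by swapping the `-` and `:` separators for underscores:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name in this repo.

    Split names replace '-' and ':' in the run timestamp with '_',
    e.g. "2023-09-23T07-03-23.990596" -> "2023_09_23T07_03_23.990596".
    This is an illustrative helper, not part of any official tooling.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(timestamp_to_split_name("2023-07-20T10:01:10.556120"))
# -> 2023_07_20T10_01_10.556120
```

The resulting string can then be passed as `split=` to `load_dataset` to pin a specific run instead of `latest`.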
## Latest results
These are the [latest results from run 2023-09-23T07:03:23.990596](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-instruct/blob/main/results_2023-09-23T07-03-23.990596.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.2429739932885906,
"em_stderr": 0.004392127579519805,
"f1": 0.2939712667785233,
"f1_stderr": 0.004382684089142145,
"acc": 0.3664330383509068,
"acc_stderr": 0.00868382013779556
},
"harness|drop|3": {
"em": 0.2429739932885906,
"em_stderr": 0.004392127579519805,
"f1": 0.2939712667785233,
"f1_stderr": 0.004382684089142145
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.0045482295338363475
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754772
}
}
```
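The results dictionary above nests metrics under an `"all"` aggregate plus one key per task. A small sketch of working with it (the `flatten_results` helper is our own, not part of any leaderboard tooling) flattens it into per-task metric rows:

```python
def flatten_results(results: dict) -> list[tuple[str, str, float]]:
    """Flatten the nested results dict into (task, metric, value) rows.

    Skips the "all" key, which holds the pre-aggregated averages.
    Illustrative helper only.
    """
    rows = []
    for task, metrics in results.items():
        if task == "all":
            continue
        for metric, value in metrics.items():
            rows.append((task, metric, value))
    return rows


sample = {
    "all": {"acc": 0.3664330383509068},
    "harness|winogrande|5": {
        "acc": 0.7048145224940805,
        "acc_stderr": 0.012819410741754772,
    },
}
print(flatten_results(sample))
# -> [('harness|winogrande|5', 'acc', 0.7048145224940805),
#     ('harness|winogrande|5', 'acc_stderr', 0.012819410741754772)]
```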
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mtalrefaie/arallama-dataset-v1.0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: warc_headers
struct:
- name: warc-record-id
dtype: string
- name: warc-date
dtype: string
- name: content-type
dtype: string
- name: content-length
dtype: int32
- name: warc-type
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-block-digest
dtype: string
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float32
- name: harmful_pp
dtype: float32
- name: tlsh
dtype: string
- name: quality_warnings
sequence: string
- name: categories
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float32
splits:
- name: train
num_bytes: 63195041070
num_examples: 4975268
download_size: 29939473314
dataset_size: 63195041070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_mistralai__Mixtral-8x7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of mistralai/Mixtral-8x7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mixtral-8x7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T04:20:22.140239](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-05T04-20-22.140239.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7126033327488117,\n\
\ \"acc_stderr\": 0.030215739102142546,\n \"acc_norm\": 0.7164863994184663,\n\
\ \"acc_norm_stderr\": 0.030796061622697008,\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649788114114722,\n\
\ \"mc2_stderr\": 0.015119260704075871\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441377,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6858195578570006,\n\
\ \"acc_stderr\": 0.004632399677490809,\n \"acc_norm\": 0.8755228042222665,\n\
\ \"acc_norm_stderr\": 0.003294504807555227\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774564,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330378,\n \"\
acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n\
\ \"acc_stderr\": 0.024063156416822523,\n \"acc_norm\": 0.8686868686868687,\n\
\ \"acc_norm_stderr\": 0.024063156416822523\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205145,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205145\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971716,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971716\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n\
\ \"acc_stderr\": 0.011622736692041287,\n \"acc_norm\": 0.879948914431673,\n\
\ \"acc_norm_stderr\": 0.011622736692041287\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617897,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617897\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.016669799592112032,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.016669799592112032\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.02198603218206415,\n\
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.02198603218206415\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\
\ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\
\ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5443285528031291,\n\
\ \"acc_stderr\": 0.012719949543032228,\n \"acc_norm\": 0.5443285528031291,\n\
\ \"acc_norm_stderr\": 0.012719949543032228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02456220431414231,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02456220431414231\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649788114114722,\n\
\ \"mc2_stderr\": 0.015119260704075871\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.01342838248127424\n }\n}\n```"
repo_url: https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-20-22.140239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-20-22.140239.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- '**/details_harness|winogrande|5_2024-01-05T04-20-22.140239.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T04-20-22.140239.parquet'
- config_name: results
data_files:
- split: 2024_01_05T04_20_22.140239
path:
- results_2024-01-05T04-20-22.140239.parquet
- split: latest
path:
- results_2024-01-05T04-20-22.140239.parquet
---
# Dataset Card for Evaluation run of mistralai/Mixtral-8x7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mistralai__Mixtral-8x7B-Instruct-v0.1",
	"harness_winogrande_5",
	split="latest")
```
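Each evaluated task is also exposed as its own configuration (listed in the YAML header above). The config names appear to be derived from the harness result keys by replacing the separators `|`, `:`, and `-` with underscores; the helper below is an illustrative sketch of that mapping, not part of any official tooling:

```python
import re


def harness_key_to_config_name(key: str) -> str:
    """Sketch of the apparent mapping from a harness results key to a config name.

    The config names in this repository look like the harness keys with the
    separators "|", ":" and "-" replaced by underscores, e.g.
    "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0".
    """
    return re.sub(r"[|:\-]", "_", key)
```

For instance, `harness|hendrycksTest-world_religions|5` maps to the `harness_hendrycksTest_world_religions_5` configuration listed above.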
## Latest results
These are the [latest results from run 2024-01-05T04:20:22.140239](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-05T04-20-22.140239.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7126033327488117,
"acc_stderr": 0.030215739102142546,
"acc_norm": 0.7164863994184663,
"acc_norm_stderr": 0.030796061622697008,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649788114114722,
"mc2_stderr": 0.015119260704075871
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441377,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068738
},
"harness|hellaswag|10": {
"acc": 0.6858195578570006,
"acc_stderr": 0.004632399677490809,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.003294504807555227
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774564,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330378,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205145,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205145
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971716,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971716
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041287,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041287
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617897,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112032,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112032
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.02198603218206415,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.02198603218206415
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059686,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5443285528031291,
"acc_stderr": 0.012719949543032228,
"acc_norm": 0.5443285528031291,
"acc_norm_stderr": 0.012719949543032228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02456220431414231,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02456220431414231
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649788114114722,
"mc2_stderr": 0.015119260704075871
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989247
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127424
}
}
```
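The per-task entries above share the `harness|<task>|<num_fewshot>` key pattern, so aggregates such as a mean accuracy over the MMLU (`hendrycksTest`) tasks can be computed by filtering on the key prefix. A minimal sketch, using values copied from the results above:

```python
# Two MMLU entries and one non-MMLU entry, copied from the results above.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8771929824561403},
    "harness|hendrycksTest-virology|5": {"acc": 0.5060240963855421},
    "harness|winogrande|5": {"acc": 0.8105761641673244},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
```

The same prefix filter works on the full per-task dictionary in the results file.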
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of 01-ai/Yi-9B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [01-ai/Yi-9B](https://huggingface.co/01-ai/Yi-9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_01-ai__Yi-9B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T00:53:16.402231](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-9B/blob/main/results_2024-03-07T00-53-16.402231.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6944616852197417,\n\
\ \"acc_stderr\": 0.030657841258996534,\n \"acc_norm\": 0.7006110175468258,\n\
\ \"acc_norm_stderr\": 0.031247040652107712,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42448173501507824,\n\
\ \"mc2_stderr\": 0.014715889893086314\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.01445686294465065,\n\
\ \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.014241614207414044\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5887273451503684,\n\
\ \"acc_stderr\": 0.004910588449330019,\n \"acc_norm\": 0.7881896036646087,\n\
\ \"acc_norm_stderr\": 0.00407756134927239\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7191489361702128,\n \"acc_stderr\": 0.029379170464124818,\n\
\ \"acc_norm\": 0.7191489361702128,\n \"acc_norm_stderr\": 0.029379170464124818\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6005291005291006,\n \"acc_stderr\": 0.02522545028406793,\n \"\
acc_norm\": 0.6005291005291006,\n \"acc_norm_stderr\": 0.02522545028406793\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.5873015873015873,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\
\ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\
\ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5812807881773399,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.5812807881773399,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.02519092111460391,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.02519092111460391\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240514,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240514\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7589743589743589,\n \"acc_stderr\": 0.02168554666533319,\n \
\ \"acc_norm\": 0.7589743589743589,\n \"acc_norm_stderr\": 0.02168554666533319\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.02986960509531691,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.02986960509531691\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057922,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057922\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8678899082568807,\n \"acc_stderr\": 0.014517801914598238,\n \"\
acc_norm\": 0.8678899082568807,\n \"acc_norm_stderr\": 0.014517801914598238\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801598,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801598\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194166,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194166\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\
\ \"acc_stderr\": 0.013140225515611724,\n \"acc_norm\": 0.8390804597701149,\n\
\ \"acc_norm_stderr\": 0.013140225515611724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451163,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n\
\ \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n\
\ \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.704248366013072,\n \"acc_stderr\": 0.01846315413263281,\n \
\ \"acc_norm\": 0.704248366013072,\n \"acc_norm_stderr\": 0.01846315413263281\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42448173501507824,\n\
\ \"mc2_stderr\": 0.014715889893086314\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.01173504356412674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.489764973464746,\n \
\ \"acc_stderr\": 0.013769598923012397\n }\n}\n```"
repo_url: https://huggingface.co/01-ai/Yi-9B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|arc:challenge|25_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|gsm8k|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hellaswag|10_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T00-53-16.402231.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T00-53-16.402231.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- '**/details_harness|winogrande|5_2024-03-07T00-53-16.402231.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T00-53-16.402231.parquet'
- config_name: results
data_files:
- split: 2024_03_07T00_53_16.402231
path:
- results_2024-03-07T00-53-16.402231.parquet
- split: latest
path:
- results_2024-03-07T00-53-16.402231.parquet
---
# Dataset Card for Evaluation run of 01-ai/Yi-9B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [01-ai/Yi-9B](https://huggingface.co/01-ai/Yi-9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_01-ai__Yi-9B",
"harness_winogrande_5",
split="train")
```
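The timestamped split names above are simply the run timestamp with `-` and `:` replaced by `_` (the parquet file names instead replace `:` with `-`), and the `latest` split aliases the newest run. A minimal sketch of this naming convention (the helper names are illustrative, not part of any library):

```python
def run_timestamp_to_split_name(run_timestamp: str) -> str:
    # "2024-03-07T00:53:16.402231" -> "2024_03_07T00_53_16.402231"
    return run_timestamp.replace("-", "_").replace(":", "_")

def run_timestamp_to_file_stamp(run_timestamp: str) -> str:
    # "2024-03-07T00:53:16.402231" -> "2024-03-07T00-53-16.402231"
    # (the form used inside the parquet file names)
    return run_timestamp.replace(":", "-")
```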
## Latest results
These are the [latest results from run 2024-03-07T00:53:16.402231](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-9B/blob/main/results_2024-03-07T00-53-16.402231.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6944616852197417,
"acc_stderr": 0.030657841258996534,
"acc_norm": 0.7006110175468258,
"acc_norm_stderr": 0.031247040652107712,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42448173501507824,
"mc2_stderr": 0.014715889893086314
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.01445686294465065,
"acc_norm": 0.6117747440273038,
"acc_norm_stderr": 0.014241614207414044
},
"harness|hellaswag|10": {
"acc": 0.5887273451503684,
"acc_stderr": 0.004910588449330019,
"acc_norm": 0.7881896036646087,
"acc_norm_stderr": 0.00407756134927239
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7191489361702128,
"acc_stderr": 0.029379170464124818,
"acc_norm": 0.7191489361702128,
"acc_norm_stderr": 0.029379170464124818
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6005291005291006,
"acc_stderr": 0.02522545028406793,
"acc_norm": 0.6005291005291006,
"acc_norm_stderr": 0.02522545028406793
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5812807881773399,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.5812807881773399,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.02519092111460391,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.02519092111460391
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240514,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7589743589743589,
"acc_stderr": 0.02168554666533319,
"acc_norm": 0.7589743589743589,
"acc_norm_stderr": 0.02168554666533319
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.02986960509531691,
"acc_norm": 0.4,
"acc_norm_stderr": 0.02986960509531691
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057922,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057922
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8678899082568807,
"acc_stderr": 0.014517801914598238,
"acc_norm": 0.8678899082568807,
"acc_norm_stderr": 0.014517801914598238
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801598,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801598
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194166,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611724,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451163,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.704248366013072,
"acc_stderr": 0.01846315413263281,
"acc_norm": 0.704248366013072,
"acc_norm_stderr": 0.01846315413263281
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42448173501507824,
"mc2_stderr": 0.014715889893086314
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.01173504356412674
},
"harness|gsm8k|5": {
"acc": 0.489764973464746,
"acc_stderr": 0.013769598923012397
}
}
```
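Since the results above are plain nested JSON, they can be post-processed with ordinary dict operations, for instance to rank tasks by accuracy. A minimal sketch (the values are copied from the JSON above; the variable names are illustrative):

```python
# A few per-task accuracies copied from the results above.
results = {
    "harness|hendrycksTest-marketing|5": 0.9102564102564102,
    "harness|hendrycksTest-us_foreign_policy|5": 0.9,
    "harness|hendrycksTest-virology|5": 0.5120481927710844,
    "harness|hendrycksTest-moral_scenarios|5": 0.4,
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
best_task, best_acc = ranked[0]
```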
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mboth/kaelteErzeugen-50-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Kaelteanlage
'1': KaeltekreisAllgemein
'2': Kaeltemaschine
'3': Kaeltemengenzaehler
'4': Klappe
'5': Pumpe
'6': RKW
'7': Regler
'8': Ruecklauf
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 72126.24090121317
num_examples: 293
- name: test
num_bytes: 18282
num_examples: 73
- name: valid
num_bytes: 18282
num_examples: 73
download_size: 54220
dataset_size: 108690.24090121317
---
# Dataset Card for "kaelteErzeugen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baptistecolle/sam-controlnet-2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: filepath
dtype: string
- name: sentids
list: int32
- name: filename
dtype: string
- name: imgid
dtype: int32
- name: split
dtype: string
- name: sentences
struct:
- name: tokens
list: string
- name: raw
dtype: string
- name: imgid
dtype: int32
- name: sentid
dtype: int32
- name: cocoid
dtype: int32
splits:
- name: original
num_bytes: 160172316.0
num_examples: 1000
download_size: 0
dataset_size: 160172316.0
---
# Dataset Card for "sam-controlnet-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dura-garage/nep-spell-eval-dataset-6200 | ---
license: mit
---
|
tr416/_dataset_20231007_153958 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73889
dataset_size: 770400.0
---
# Dataset Card for "_dataset_20231007_153958"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mukayese/truthful_qa-tr | ---
license: apache-2.0
---
|
Sleoruiz/disc_cla_primera-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 32328969
num_examples: 11713
download_size: 16700030
dataset_size: 32328969
---
# Dataset Card for "disc_cla_primera-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pablao0948/Marina_Sena | ---
license: openrail
---
|
arbml/TinyCalliFont | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 489803410.8
num_examples: 1600
- name: test
num_bytes: 61689478.0
num_examples: 200
- name: validation
num_bytes: 62852027.0
num_examples: 200
download_size: 617998205
dataset_size: 614344915.8
---
# Dataset Card for "TinyCalliFont"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713002663 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2460067
num_examples: 7251
download_size: 1409617
dataset_size: 2460067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lightblue/gpt4_conversations_multilingual | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 97523643.84159379
num_examples: 9217
download_size: 43657898
dataset_size: 97523643.84159379
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_RatanRohith__NeuralPizza-Valor-7B-Merge-slerp | ---
pretty_name: Evaluation run of RatanRohith/NeuralPizza-Valor-7B-Merge-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RatanRohith/NeuralPizza-Valor-7B-Merge-slerp](https://huggingface.co/RatanRohith/NeuralPizza-Valor-7B-Merge-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralPizza-Valor-7B-Merge-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T23:00:14.356889](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-Valor-7B-Merge-slerp/blob/main/results_2024-01-22T23-00-14.356889.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/RatanRohith/NeuralPizza-Valor-7B-Merge-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|arc:challenge|25_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|gsm8k|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hellaswag|10_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-00-14.356889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T23-00-14.356889.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- '**/details_harness|winogrande|5_2024-01-22T23-00-14.356889.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T23-00-14.356889.parquet'
- config_name: results
data_files:
- split: 2024_01_22T23_00_14.356889
path:
- results_2024-01-22T23-00-14.356889.parquet
- split: latest
path:
- results_2024-01-22T23-00-14.356889.parquet
---
# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-Valor-7B-Merge-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-Valor-7B-Merge-slerp](https://huggingface.co/RatanRohith/NeuralPizza-Valor-7B-Merge-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-Valor-7B-Merge-slerp",
"harness_winogrande_5",
	split="latest")
```
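The second positional argument to `load_dataset` is the config name. Judging from the config list in the header above, those names appear to be derived from the harness task ids by replacing `|`, `:`, and `-` with underscores; the helper below is a small illustration of that inferred pattern, not an official `datasets` API:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id such as 'harness|truthfulqa:mc|0' to the
    config name used in this repo ('harness_truthfulqa_mc_0').

    Pattern inferred from the config list in this card, not an official API.
    """
    for separator in "|:-":
        task = task.replace(separator, "_")
    return task

print(task_to_config_name("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
```

This makes it easy to go from a task name in the results JSON below to the matching config.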
## Latest results
These are the [latest results from run 2024-01-22T23:00:14.356889](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-Valor-7B-Merge-slerp/blob/main/results_2024-01-22T23-00-14.356889.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
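Since the block above is a plain Python dict, per-task scores can be inspected locally without re-downloading anything. The sketch below reproduces just a few of the entries above and flattens them into a ranked list:

```python
# Flatten a subset of the "Latest results" dict above into (task, acc)
# pairs and rank them from best to worst accuracy.
results = {
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
    "harness|hellaswag|10": {"acc": 0.2504481179047998},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.3216374269005848},
    "harness|winogrande|5": {"acc": 0.4956590370955012},
}

ranked = sorted(
    ((task, scores["acc"]) for task, scores in results.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for task, acc in ranked:
    print(f"{task}: {acc:.4f}")
```

The same pattern applies to the full dict, or to a row loaded from the "results" config.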
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1 | ---
pretty_name: Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zyh3826/llama2-13b-ft-openllm-leaderboard-v1](https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T15:33:42.644192](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1/blob/main/results_2023-12-09T15-33-42.644192.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6016495918139398,\n\
\ \"acc_stderr\": 0.03270798736533002,\n \"acc_norm\": 0.612894192678486,\n\
\ \"acc_norm_stderr\": 0.033541474205616734,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.40723683293857477,\n\
\ \"mc2_stderr\": 0.01336809717170015\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526842,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.01433715891426844\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6276638119896435,\n\
\ \"acc_stderr\": 0.004824393076826628,\n \"acc_norm\": 0.8314080860386377,\n\
\ \"acc_norm_stderr\": 0.0037362592995204874\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983056,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.02556060472102288,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.02556060472102288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218957,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218957\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016015,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162111,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162111\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073063,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073063\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768907,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768907\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.40723683293857477,\n\
\ \"mc2_stderr\": 0.01336809717170015\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698329\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.0031957470754808027\n }\n}\n```"
repo_url: https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-33-42.644192.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- '**/details_harness|winogrande|5_2023-12-09T15-33-42.644192.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T15-33-42.644192.parquet'
- config_name: results
data_files:
- split: 2023_12_09T15_33_42.644192
path:
- results_2023-12-09T15-33-42.644192.parquet
- split: latest
path:
- results_2023-12-09T15-33-42.644192.parquet
---
# Dataset Card for Evaluation run of zyh3826/llama2-13b-ft-openllm-leaderboard-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zyh3826/llama2-13b-ft-openllm-leaderboard-v1](https://huggingface.co/zyh3826/llama2-13b-ft-openllm-leaderboard-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-09T15:33:42.644192](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__llama2-13b-ft-openllm-leaderboard-v1/blob/main/results_2023-12-09T15-33-42.644192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6016495918139398,
"acc_stderr": 0.03270798736533002,
"acc_norm": 0.612894192678486,
"acc_norm_stderr": 0.033541474205616734,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.40723683293857477,
"mc2_stderr": 0.01336809717170015
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526842,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.01433715891426844
},
"harness|hellaswag|10": {
"acc": 0.6276638119896435,
"acc_stderr": 0.004824393076826628,
"acc_norm": 0.8314080860386377,
"acc_norm_stderr": 0.0037362592995204874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983056,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102288,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218957,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218957
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016015,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162111,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162111
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165538,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073063,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073063
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768907,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768907
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.40723683293857477,
"mc2_stderr": 0.01336809717170015
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698329
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808027
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Felipefloke/audiomodelo | ---
license: openrail
---
|
Minglii/r_pv4_wiz70 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 118128783
num_examples: 46064
download_size: 57212134
dataset_size: 118128783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "r_pv4_wiz70"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nyntany/REPO | ---
license: openrail
---
|
Math47/y | ---
license: openrail
---
|
Shularp/BDMS02_TH_AR_unchanged_with_quotation | ---
dataset_info:
features:
- name: th
dtype: string
- name: ar
dtype: string
splits:
- name: train
num_bytes: 19731
num_examples: 85
download_size: 9903
dataset_size: 19731
---
# Dataset Card for "BDMS02_TH_AR_unchanged_with_quotation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PJMixers/coedit-reworded-deduped-multiturn-sharegpt | ---
language:
- en
size_categories:
- 1K<n<10K
---
Each sample contains 1 to 32 pairs.
```
pair_lengths Minimum: 44
pair_lengths Maximum: 2573
pair_lengths Average: 1053
turn_counts Minimum: 1
turn_counts Maximum: 32
turn_counts Average: 16.5
```
- *`pair_lengths` counted using metharme tags and the mistral tokenizer*
- *`turn_counts` counted for pairs of human/gpt* |
pminervini/averitec | ---
dataset_info:
features:
- name: cached_original_claim_url
dtype: string
- name: speaker
dtype: string
- name: required_reannotation
dtype: bool
- name: reporting_source
dtype: string
- name: label
dtype: string
- name: claim_types
sequence: string
- name: fact_checking_article
dtype: string
- name: fact_checking_strategies
sequence: string
- name: claim
dtype: string
- name: justification
dtype: string
- name: location_ISO_code
dtype: string
- name: original_claim_url
dtype: string
- name: questions
list:
- name: answers
list:
- name: answer
dtype: string
- name: answer_type
dtype: string
- name: boolean_explanation
dtype: string
- name: cached_source_url
dtype: string
- name: source_medium
dtype: string
- name: source_url
dtype: string
- name: question
dtype: string
- name: claim_date
dtype: string
splits:
- name: train
num_bytes: 6038474
num_examples: 3068
- name: dev
num_bytes: 1087211
num_examples: 500
download_size: 3363933
dataset_size: 7125685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
---
|
Mihaiii/OpenHermes-2.5-1k-longest-curated | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4176433
num_examples: 519
download_size: 1835764
dataset_size: 4176433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a dataset that was created from [HuggingFaceH4/OpenHermes-2.5-1k-longest](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest).
The purpose is to be able to use it in an [axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) config by adding:
```yaml
datasets:
- path: Mihaiii/OpenHermes-2.5-1k-longest-curated
type: alpaca
```
I eliminated rows that:
1) Had a system prompt (only 3 rows eliminated)
2) Contained in the output a character repeated 10 times in a row (478 rows eliminated)
So from a 1000-row dataset, I ended up with a 519-row dataset.
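The repeated-character filter can be sketched roughly as follows (a minimal sketch, not the exact code from the notebook; the field names `system` and `output` are illustrative, and the threshold of 10 is taken from the description above):

```python
import re

def has_long_char_run(text: str, threshold: int = 10) -> bool:
    """Return True if any single character repeats `threshold` times in a row."""
    # (.) captures one character; \1{N,} matches N further copies of it,
    # so with N = threshold - 1 this finds a run of at least `threshold`
    # identical characters. re.DOTALL lets '.' match newlines too.
    pattern = r"(.)\1{%d,}" % (threshold - 1)
    return re.search(pattern, text, re.DOTALL) is not None

def keep_row(row: dict) -> bool:
    # Drop rows that have a system prompt or a long character run in the output.
    # (Field names here are assumptions for illustration.)
    return not row.get("system") and not has_long_char_run(row["output"])
```

Note that, as the later edit below points out, a filter this blunt also catches legitimate code samples whose indentation uses 10+ consecutive spaces.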
See the [OpenHermes-2.5-1k-longest-curated.ipynb](https://huggingface.co/datasets/Mihaiii/OpenHermes-2.5-1k-longest-curated/blob/main/OpenHermes-2.5-1k-longest-curated.ipynb) notebook for details on how the dataset was constructed.
**Later edit**: after a more in-depth analysis of the dataset, I noticed that:
1) The imported subset is `test_sft`, but this is the 2nd chunk of the top 1k records. The first one is in the `train_sft` subset.
2) Valid code records that contained 10 repeated spaces for indentation were also eliminated. |
CyberHarem/barbara_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of barbara/バーバラ/芭芭拉 (Genshin Impact)
This is the dataset of barbara/バーバラ/芭芭拉 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `twintails, blonde_hair, blue_eyes, long_hair, drill_hair, twin_drills, hat, bow, white_headwear, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.11 GiB | [Download](https://huggingface.co/datasets/CyberHarem/barbara_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 930.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/barbara_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1420 | 1.90 GiB | [Download](https://huggingface.co/datasets/CyberHarem/barbara_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/barbara_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, long_sleeves, looking_at_viewer, simple_background, smile, solo, white_background, white_dress, white_pantyhose, ass, blush, closed_mouth, detached_sleeves, looking_back, detached_collar, open_mouth |
| 1 | 14 |  |  |  |  |  | 1girl, bare_shoulders, detached_collar, detached_sleeves, long_sleeves, looking_at_viewer, open_mouth, simple_background, smile, solo, white_dress, white_background, white_pantyhose, ;d, one_eye_closed, blush, book, frilled_dress, vision_(genshin_impact), latin_cross, strapless_dress, white_sleeves |
| 2 | 6 |  |  |  |  |  | 1girl, :d, detached_sleeves, long_sleeves, looking_at_viewer, open_mouth, solo, white_background, white_dress, bare_shoulders, blush, book, simple_background, vision_(genshin_impact), cross |
| 3 | 5 |  |  |  |  |  | 1girl, :d, bare_shoulders, detached_collar, hair_between_eyes, long_sleeves, looking_at_viewer, open_mouth, simple_background, solo, upper_body, white_background, white_dress, detached_sleeves, blush, cross, hand_up, teeth |
| 4 | 7 |  |  |  |  |  | 1girl, :d, bare_shoulders, blush, detached_collar, detached_sleeves, long_sleeves, looking_at_viewer, open_mouth, solo, white_dress, sidelocks, book, blurry, hair_between_eyes, cross, white_pantyhose |
| 5 | 6 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, blue_sky, casual_one-piece_swimsuit, choker, cloud, day, detached_sleeves, long_sleeves, looking_at_viewer, official_alternate_costume, open_mouth, outdoors, sailor_hat, solo, thigh_strap, water, :d, duck, ocean, wading, bag, blue_sailor_collar, bowtie, hair_flower, blush |
| 6 | 6 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, blue_sky, casual_one-piece_swimsuit, covered_navel, day, detached_sleeves, duck, long_sleeves, looking_at_viewer, official_alternate_costume, open_mouth, outdoors, sailor_hat, smile, solo, ;d, bare_shoulders, blue_sailor_collar, blush, bowtie, choker, cloud, cowboy_shot, ocean, one_eye_closed, thigh_strap, water, animal_bag, hair_flower, sunlight, wading, yellow_bow |
| 7 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, simple_background, solo, strapless_leotard, white_background, blush, wrist_cuffs, bowtie, sitting, alternate_costume, closed_mouth, covered_navel, medium_breasts, pantyhose, thighs, white_leotard |
| 8 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, looking_at_viewer, navel, solo, collarbone, stomach, sitting, large_breasts, medium_breasts, black_bra, black_panties, parted_lips, sidelocks, strap_slip, thighs, underwear_only, white_bra, white_panties |
| 9 | 7 |  |  |  |  |  | 1girl, alternate_costume, long_sleeves, looking_at_viewer, open_mouth, solo, blush, bowtie, pleated_skirt, white_shirt, :d, blue_skirt, school_uniform, blue_bow, collared_shirt, thighhighs, hand_up, jacket, open_clothes, simple_background, upper_teeth_only, white_pantyhose, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | long_sleeves | looking_at_viewer | simple_background | smile | solo | white_background | white_dress | white_pantyhose | ass | blush | closed_mouth | detached_sleeves | looking_back | detached_collar | open_mouth | ;d | one_eye_closed | book | frilled_dress | vision_(genshin_impact) | latin_cross | strapless_dress | white_sleeves | :d | cross | hair_between_eyes | upper_body | hand_up | teeth | sidelocks | blurry | blue_one-piece_swimsuit | blue_sky | casual_one-piece_swimsuit | choker | cloud | day | official_alternate_costume | outdoors | sailor_hat | thigh_strap | water | duck | ocean | wading | bag | blue_sailor_collar | bowtie | hair_flower | covered_navel | cowboy_shot | animal_bag | sunlight | yellow_bow | cleavage | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | sitting | alternate_costume | medium_breasts | pantyhose | thighs | white_leotard | navel | collarbone | stomach | large_breasts | black_bra | black_panties | parted_lips | strap_slip | underwear_only | white_bra | white_panties | pleated_skirt | white_shirt | blue_skirt | school_uniform | blue_bow | collared_shirt | thighhighs | jacket | open_clothes | upper_teeth_only | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:--------------------|:--------|:-------|:-------------------|:--------------|:------------------|:------|:--------|:---------------|:-------------------|:---------------|:------------------|:-------------|:-----|:-----------------|:-------|:----------------|:--------------------------|:--------------|:------------------|:----------------|:-----|:--------|:--------------------|:-------------|:----------|:--------|:------------|:---------|:--------------------------|:-----------|:----------------------------|:---------|:--------|:------|:-----------------------------|:-----------|:-------------|:--------------|:--------|:-------|:--------|:---------|:------|:---------------------|:---------|:--------------|:----------------|:--------------|:-------------|:-----------|:-------------|:-----------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:----------|:--------------------|:-----------------|:------------|:---------|:----------------|:--------|:-------------|:----------|:----------------|:------------|:----------------|:--------------|:-------------|:-----------------|:------------|:----------------|:----------------|:--------------|:-------------|:-----------------|:-----------|:-----------------|:-------------|:---------|:---------------|:-------------------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | X | | X | | | X | | | X | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | X | | X | | X | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | | | X | | X | X | | X | | X | | X | X | | | X | | | | | | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | | | X | | | | | X | | X | | | X | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | | | | X | | X | | | X | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | X | | X | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | X | X | X | | X | | | X | | X | | | | | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
uetchy/thesession | ---
license: odbl
---
|
hayden-donnelly/remilio | ---
license: other
license_name: viral-public-license
license_link: LICENSE
size_categories:
- 1K<n<10K
task_categories:
- image-classification
- unconditional-image-generation
- text-to-image
language:
- en
pretty_name: Remilio
---
# Remilio
[Redacted Remilio Babies](https://remilio.org/) is a collection of 10,000 neochibi pfpNFT's evolving the
proven Milady Maker paradigm with the introduction of young J.I.T. energy, schizophrenic reactionary aesthetics,
and digital sales terrorism.
 |
Aruno/guanaco_jp | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ja
pretty_name: Guanaco Japanese Prompt
---
Japanese prompts from [GuanacoDataset](https://huggingface.co/datasets/JosephusCheung/GuanacoDataset), extracted using `langdetect`. |
wenzhuoliu/MOOC_Answer_Reformulation | ---
dataset_info:
config_name: Query_Answers
features:
- name: query
dtype: string
- name: input_docs
sequence: string
- name: gpt_answer
dtype: string
splits:
- name: religion
num_bytes: 162873
num_examples: 124
- name: python
num_bytes: 84949
num_examples: 79
- name: history
num_bytes: 50557
num_examples: 58
- name: recherche_reproductible
num_bytes: 59372
num_examples: 52
download_size: 488352
dataset_size: 408308
configs:
- config_name: Query_Answers
data_files:
- split: history
path: Query_Answers/history-*
- split: religion
path: Query_Answers/religion-*
- split: recherche_reproductible
path: Query_Answers/recherche_reproductible-*
- split: python
path: Query_Answers/python-*
---
|
cogumelo22/cogumelo22 | ---
license: openrail
---
|
Arthur91284/Arthur91284 | ---
license: openrail
---
|
lilacai/lilac-dolphin | ---
tags:
- Lilac
---
# lilac/dolphin
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/cognitivecomputations/dolphin](https://huggingface.co/datasets/cognitivecomputations/dolphin)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-dolphin
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-dolphin")
```
|
pvduy/ultra-mix-7k-code | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: content
dtype: string
- name: is_code
dtype: bool
splits:
- name: train
num_bytes: 71828083
num_examples: 8228
download_size: 33892874
dataset_size: 71828083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gabriel1322/xit | ---
license: openrail
---
|
sohonjit/brats2023_multidomain_i2i | ---
license: mit
task_categories:
- image-to-image
language:
- en
tags:
- medical
---
## Dataset Description
- **Paper:** Under Review.
- **Point of Contact:** Arijit Ghosh, arijit.ghosh@fau.de
### Dataset Summary
This dataset is based on the BraTS2023 dataset and is intended for the multi-domain image-to-image translation task. It takes the 5 middle slices from each NIfTI volume of the BraTS2023 dataset after normalizing to the range (-1, 1). All images are `.npy` files and can be loaded with `np.load(FILEPATH).astype(np.float32)`. We provide a training and a test set containing 6255 and 1095 files respectively for each domain. There are 4 domains, named accordingly.
It is highly recommended to create a separate validation set from the training dataset. We do this with PyTorch, using the following command.
```python
seed = 97
train_dataset, val_dataset = torch.utils.data.random_split(
dataset, lengths=(0.9, 0.1), generator=torch.Generator().manual_seed(seed)
) # dataset is the dataset instance.
```
This dataset is part of a paper that is currently under peer review.
We hope this helps the community. |
bigscience-data/roots_eu_wikibooks | ---
language: eu
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_eu_wikibooks
# wikibooks_filtered
- Dataset uid: `wikibooks_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0897 % of total
- 0.2591 % of en
- 0.0965 % of fr
- 0.1691 % of es
- 0.2834 % of indic-hi
- 0.2172 % of pt
- 0.0149 % of zh
- 0.0279 % of ar
- 0.1374 % of vi
- 0.5025 % of id
- 0.3694 % of indic-ur
- 0.5744 % of eu
- 0.0769 % of ca
- 0.0519 % of indic-ta
- 0.1470 % of indic-mr
- 0.0751 % of indic-te
- 0.0156 % of indic-bn
- 0.0476 % of indic-ml
- 0.0087 % of indic-pa
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-pa
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
|
nikchar/konstantinaki_paper_test_top_5 | ---
dataset_info:
features:
- name: label
dtype: string
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence
sequence: string
- name: retrieved_evidence_text
sequence: string
splits:
- name: train
num_bytes: 50870680
num_examples: 11073
download_size: 21296684
dataset_size: 50870680
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "konstantinaki_paper_test_top_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akkang321/testlecode | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5818
num_examples: 30
download_size: 3673
dataset_size: 5818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amd-nicknick/bert-base-uncased-2022_tokenized_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 25352412298
num_examples: 80462898
download_size: 6782996622
dataset_size: 25352412298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
avaliev/drugchat | ---
license: bsd-3-clause
task_categories:
- question-answering
language:
- en
tags:
- biology
- chemistry
- medical
pretty_name: DrugChat Dataset
size_categories:
- 10K<n<100K
---
This dataset is for the "DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs" paper.
```bibtex
@article{liang2023drugchat,
  title={DrugChat: Towards Enabling ChatGPT-Like Capabilities on Drug Molecule Graphs},
  author={Liang, Youwei and Zhang, Ruiyi and Zhang, Li and Xie, Pengtao},
  journal={TechRxiv},
  year={2023}
}
```
There are no changes to the data; only a `datasets` loading class was added to download the set from HF and generate splits from the data in `.json` files. |
hzydashi/KatagoExceptGo | ---
license: mit
tags:
- alphago
- alphazero
- board games
- gomoku
- katago
- NNUE
- Stockfish
size_categories:
- 100M<n<1B
--- |
Iftitahu/sundanese_instruct_stories | ---
license: cc-by-4.0
task_categories:
- translation
- text-generation
- text2text-generation
language:
- id
- jv
- su
- en
size_categories:
- n<1K
---
A dataset of parallel translation-based instructions with Sundanese as the target language. </br>
Materials are taken from randomly selected children's stories at https://storyweaver.org.in, under the CC-BY-SA-4.0 license. </br>
The template IDs are:</br>
(1, 'Tarjamahkeun teks dongeng barudak di handap tina teks basa Inggris kana teks basa Sunda:', 'Tarjamahan atawa sasaruaan naskah dina basa Sunda:'),</br>
(2, 'Tarjamahkeun teks dongeng barudak di handap tina teks basa Indonesia kana teks basa Sunda:', 'Tarjamahan atawa sasaruaan naskah dina basa Sunda:'),</br>
(3, 'Tarjamahkeun teks dongeng barudak di handap tina teks basa Jawa kana teks basa Sunda:', 'Tarjamahan atawa sasaruaan naskah dina basa Sunda:'),</br></br>
Data is mainly composed of three parallel language samples as prompt inputs and target completions:</br>
1. <b>en_sunda</b></br>
Prompt/instruction language: <i>Sundanese</i></br>
Source/input language:<i>English</i></br>
Target/output language:<i>Sundanese</i></br>
Size: 94 samples.</br>
Prompt Template:</br>
<i>inputs</i>:</br>
Tarjamahkeun teks dongeng barudak di handap tina teks basa Inggris kana teks basa Sunda:\n\n{input}\n\n</br>
<i>targets</i>:</br>
Tarjamahan atawa sasaruaan naskah dina basa Sunda:\n\n{output}</br>
2. <b>id_sunda</b></br>
Prompt/instruction language: <i>Sundanese</i></br>
Source/input language: <i>Indonesia</i></br>
Target/output language:<i>Sundanese</i></br>
Size: 94 samples.</br>
Prompt Template:</br>
<i>inputs</i>:</br>
Tarjamahkeun teks dongeng barudak di handap tina teks basa Indonesia kana teks basa Sunda:\n\n{input}\n\n</br>
<i>targets</i>:</br>
Tarjamahan atawa sasaruaan naskah dina basa Sunda:\n\n{output}</br>
3. <b>javanese_sunda</b></br>
Prompt/instruction language: <i>Sundanese</i></br>
Source/input language: <i>Javanese</i></br>
Target/output language: <i>Sundanese</i></br>
Size: 20 samples.</br>
Prompt Template:</br>
<i>inputs</i>:</br>
Tarjamahkeun teks dongeng barudak di handap tina teks basa Jawa kana teks basa Sunda:\n\n{input}\n\n</br>
<i>targets</i>:</br>
Tarjamahan atawa sasaruaan naskah dina basa Sunda:\n\n{output}</br>
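Rendered for a single sample, template ID 1 above looks like this (the story text is a hypothetical placeholder, not taken from the dataset):

```python
# Template ID 1: English -> Sundanese (en_sunda).
INPUT_TEMPLATE = (
    "Tarjamahkeun teks dongeng barudak di handap tina teks basa Inggris "
    "kana teks basa Sunda:\n\n{input}\n\n"
)
TARGET_TEMPLATE = "Tarjamahan atawa sasaruaan naskah dina basa Sunda:\n\n{output}"

# Placeholder story text for illustration only.
prompt = INPUT_TEMPLATE.format(input="Once upon a time, there was a small cat.")
completion = TARGET_TEMPLATE.format(output="Kacaritakeun baheula, aya ucing leutik.")
print(prompt + completion)
```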
Data was originally prepared to enrich multilingual resources in the Open Science AYA Project (2023). |
CorpuSlave/expertise_1 | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: doc_id
dtype: string
splits:
- name: train
num_bytes: 8027905291
num_examples: 1379830
download_size: 3070150673
dataset_size: 8027905291
---
|
tomaarsen/setfit-absa-semeval-restaurants | ---
dataset_info:
features:
- name: text
dtype: string
- name: span
dtype: string
- name: label
dtype: string
- name: ordinal
dtype: int64
splits:
- name: train
num_bytes: 490223
num_examples: 3693
- name: test
num_bytes: 138187
num_examples: 1134
download_size: 193352
dataset_size: 628410
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "tomaarsen/setfit-absa-semeval-restaurants"
### Dataset Summary
This dataset contains the manually annotated restaurant reviews from SemEval-2014 Task 4, in the format
understood by [SetFit](https://github.com/huggingface/setfit) ABSA.
For more details, see https://aclanthology.org/S14-2004/
### Data Instances
An example of "train" looks as follows.
```json
{"text": "But the staff was so horrible to us.", "span": "staff", "label": "negative", "ordinal": 0}
{"text": "To be completely fair, the only redeeming factor was the food, which was above average, but couldn't make up for all the other deficiencies of Teodora.", "span": "food", "label": "positive", "ordinal": 0}
{"text": "The food is uniformly exceptional, with a very capable kitchen which will proudly whip up whatever you feel like eating, whether it's on the menu or not.", "span": "food", "label": "positive", "ordinal": 0}
{"text": "The food is uniformly exceptional, with a very capable kitchen which will proudly whip up whatever you feel like eating, whether it's on the menu or not.", "span": "kitchen", "label": "positive", "ordinal": 0}
{"text": "The food is uniformly exceptional, with a very capable kitchen which will proudly whip up whatever you feel like eating, whether it's on the menu or not.", "span": "menu", "label": "neutral", "ordinal": 0}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
- `span`: a `string` feature showing the aspect span from the text.
- `label`: a `string` feature showing the polarity of the aspect span.
- `ordinal`: an `int64` feature giving the index of the n-th occurrence of the span in the text. This is useful if the span occurs multiple times within the same text.
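As a rough illustration (not part of SetFit itself), the ordinal of each occurrence of a span can be computed by scanning the text left to right:

```python
def span_ordinals(text: str, span: str) -> list:
    """Return the 0-based ordinal of every occurrence of `span` in `text`."""
    ordinals = []
    start = 0
    while True:
        idx = text.find(span, start)
        if idx == -1:
            return ordinals
        ordinals.append(len(ordinals))
        start = idx + len(span)

# "food" occurs once here, so its single occurrence has ordinal 0.
print(span_ordinals("The food here is great but the service is terrible", "food"))
```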
### Data Splits
| name |train|test|
|---------|----:|---:|
|tomaarsen/setfit-absa-semeval-restaurants|3693|1134|
### Training ABSA models using SetFit ABSA
To train using this dataset, first install the SetFit library:
```bash
pip install setfit
```
And then you can use the following script as a guideline of how to train an ABSA model on this dataset:
```python
from setfit import AbsaModel, AbsaTrainer, TrainingArguments
from datasets import load_dataset
from transformers import EarlyStoppingCallback
# You can initialize an AbsaModel using one or two SentenceTransformer models, or two ABSA models
model = AbsaModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
# The training/eval dataset must have `text`, `span`, `label`, and `ordinal` columns
dataset = load_dataset("tomaarsen/setfit-absa-semeval-restaurants")
train_dataset = dataset["train"]
eval_dataset = dataset["test"]
args = TrainingArguments(
output_dir="models",
use_amp=True,
batch_size=256,
eval_steps=50,
save_steps=50,
load_best_model_at_end=True,
)
trainer = AbsaTrainer(
model,
args=args,
train_dataset=train_dataset,
eval_dataset=eval_dataset,
callbacks=[EarlyStoppingCallback(early_stopping_patience=5)],
)
trainer.train()
metrics = trainer.evaluate(eval_dataset)
print(metrics)
trainer.push_to_hub("tomaarsen/setfit-absa-restaurants")
```
You can then run inference like so:
```python
from setfit import AbsaModel
# Download from Hub and run inference
model = AbsaModel.from_pretrained(
"tomaarsen/setfit-absa-restaurants-aspect",
"tomaarsen/setfit-absa-restaurants-polarity",
)
# Run inference
preds = model([
"The best pizza outside of Italy and really tasty.",
"The food here is great but the service is terrible",
])
```
### Citation Information
```bibtex
@inproceedings{pontiki-etal-2014-semeval,
title = "{S}em{E}val-2014 Task 4: Aspect Based Sentiment Analysis",
author = "Pontiki, Maria and
Galanis, Dimitris and
Pavlopoulos, John and
Papageorgiou, Harris and
Androutsopoulos, Ion and
Manandhar, Suresh",
editor = "Nakov, Preslav and
Zesch, Torsten",
booktitle = "Proceedings of the 8th International Workshop on Semantic Evaluation ({S}em{E}val 2014)",
month = aug,
year = "2014",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/S14-2004",
doi = "10.3115/v1/S14-2004",
pages = "27--35",
}
```
|
open-llm-leaderboard/details_mlabonne__GML-Mistral-merged-v1 | ---
pretty_name: Evaluation run of mlabonne/GML-Mistral-merged-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/GML-Mistral-merged-v1](https://huggingface.co/mlabonne/GML-Mistral-merged-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__GML-Mistral-merged-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T10:46:26.609299](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__GML-Mistral-merged-v1/blob/main/results_2023-12-29T10-46-26.609299.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6238015653000757,\n\
\ \"acc_stderr\": 0.03193933734157293,\n \"acc_norm\": 0.6367462570659438,\n\
\ \"acc_norm_stderr\": 0.032818108624919226,\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.5157810759126963,\n\
\ \"mc2_stderr\": 0.01645251270460575\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3993174061433447,\n \"acc_stderr\": 0.0143120945579467,\n\
\ \"acc_norm\": 0.4377133105802048,\n \"acc_norm_stderr\": 0.014497573881108294\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3623780123481378,\n\
\ \"acc_stderr\": 0.0047970481548939665,\n \"acc_norm\": 0.578868751244772,\n\
\ \"acc_norm_stderr\": 0.0049273147294335564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.01358661921990334,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.01358661921990334\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.01630389953079613,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.01630389953079613\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"\
acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n\
\ \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.5157810759126963,\n\
\ \"mc2_stderr\": 0.01645251270460575\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.0123469148634153\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/GML-Mistral-merged-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|arc:challenge|25_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|gsm8k|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hellaswag|10_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-46-26.609299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T10-46-26.609299.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- '**/details_harness|winogrande|5_2023-12-29T10-46-26.609299.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T10-46-26.609299.parquet'
- config_name: results
data_files:
- split: 2023_12_29T10_46_26.609299
path:
- results_2023-12-29T10-46-26.609299.parquet
- split: latest
path:
- results_2023-12-29T10-46-26.609299.parquet
---
# Dataset Card for Evaluation run of mlabonne/GML-Mistral-merged-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/GML-Mistral-merged-v1](https://huggingface.co/mlabonne/GML-Mistral-merged-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__GML-Mistral-merged-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-29T10:46:26.609299](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__GML-Mistral-merged-v1/blob/main/results_2023-12-29T10-46-26.609299.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6238015653000757,
"acc_stderr": 0.03193933734157293,
"acc_norm": 0.6367462570659438,
"acc_norm_stderr": 0.032818108624919226,
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148125,
"mc2": 0.5157810759126963,
"mc2_stderr": 0.01645251270460575
},
"harness|arc:challenge|25": {
"acc": 0.3993174061433447,
"acc_stderr": 0.0143120945579467,
"acc_norm": 0.4377133105802048,
"acc_norm_stderr": 0.014497573881108294
},
"harness|hellaswag|10": {
"acc": 0.3623780123481378,
"acc_stderr": 0.0047970481548939665,
"acc_norm": 0.578868751244772,
"acc_norm_stderr": 0.0049273147294335564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990334,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990334
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.01630389953079613,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.01630389953079613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265335,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24969400244798043,
"mc1_stderr": 0.015152286907148125,
"mc2": 0.5157810759126963,
"mc2_stderr": 0.01645251270460575
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.0123469148634153
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
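Once loaded, the per-task entries above are plain nested dicts and can be post-processed directly. A minimal sketch (the task names and the `"acc"` metric key follow the JSON structure shown; the values here are a small excerpt copied from the results, not the full set):

```python
import json

# Excerpt of the aggregated results JSON shown above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8675213675213675},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.3888268156424581},
    "harness|winogrande|5": {"acc": 0.7387529597474349},
}

# Rank tasks by accuracy, highest first.
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)
print(json.dumps(ranked, indent=2))
```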
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/anastasia_hoshin_rezero | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of anastasia_hoshin (Re:Zero Kara Hajimeru Isekai Seikatsu)
This is the dataset of anastasia_hoshin (Re:Zero Kara Hajimeru Isekai Seikatsu), containing 56 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
wefussell/amasum-final-test | ---
license: mit
---
|
thomasavare/waste-classification-audio-deepl | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: speaker
dtype: string
- name: transcription
dtype: string
- name: translation
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 397554225.0
num_examples: 500
download_size: 300753479
dataset_size: 397554225.0
---
# Dataset Card for "waste-classification-audio-deepl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marasaki/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 17110932
num_examples: 3000
download_size: 5364770
dataset_size: 17110932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: unknown
task_categories:
- text-classification
- token-classification
language:
- en
pretty_name: GitHub Issues Small
size_categories:
- 1K<n<10K
---
# Dataset Card for "github-issues"
A dataset of scraped GitHub issues, created in [Hugging Face NLP Course, Chapter 5 Section 5](https://huggingface.co/learn/nlp-course/chapter5/5?fw=pt).
|
jlbaker361/wikiart-balanced500 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: style
dtype: string
- name: name
dtype: string
- name: gen_style
dtype: string
splits:
- name: train
num_bytes: 457748372.7
num_examples: 12150
- name: test
num_bytes: 51892420.75
num_examples: 1350
download_size: 512269037
dataset_size: 509640793.45
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
TiagoJacobs/lamma2-bigbluebutton | ---
license: apache-2.0
---
|
DamarJati/GreenLabel-Waste-Types | ---
viewer: true
task_categories:
- text-classification
language:
- en
pretty_name: GreenLabel-Waste-Types
size_categories:
- 10K<n<100K
---
Original Datasets: https://www.kaggle.com/datasets/techsash/waste-classification-data?select=DATASET |
Intuit-GenSRF/sexting-nsfw-adultconten-es | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
- name: processed_text
sequence: string
- name: text_es
dtype: string
splits:
- name: train
num_bytes: 89678
num_examples: 538
download_size: 0
dataset_size: 89678
---
# Dataset Card for "sexting-nsfw-adultconten-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Slapping/FlowCharts | ---
license: mit
---
|
TREC-AToMiC/AToMiC-Images-v0.1 | ---
license: other
dataset_info:
features:
- name: image_id
dtype: string
- name: image_url
dtype: string
- name: caption_reference_description
sequence: string
- name: caption_attribution_description
sequence: string
- name: caption_alt_text_description
sequence: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 56792080050.0
num_examples: 3723512
- name: validation
num_bytes: 458681067.375
num_examples: 30365
- name: test
num_bytes: 316081601.5
num_examples: 20732
download_size: 57119923069
dataset_size: 57566842718.875
---
## Licensing Information
In exchange for permission to use the AToMiC database (the "Database") at TREC-AToMiC, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
2. TREC-AToMiC makes no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the TREC-AToMiC team including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
5. TREC-AToMiC reserves the right to terminate Researcher's access to the Database at any time.
6. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
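As a rough sense of scale (a back-of-the-envelope sketch, not an official statistic), the split sizes listed in the `dataset_info` block above imply an average serialized record of roughly 15 KiB:

```python
# Split statistics copied verbatim from the dataset_info block above.
splits = {
    "train": (56792080050.0, 3723512),
    "validation": (458681067.375, 30365),
    "test": (316081601.5, 20732),
}

def avg_record_bytes(num_bytes: float, num_examples: int) -> float:
    """Average serialized size per example, in bytes."""
    return num_bytes / num_examples

for name, (nbytes, nexamples) in splits.items():
    print(f"{name}: ~{avg_record_bytes(nbytes, nexamples) / 1024:.1f} KiB/example")
```

This is only the on-disk average across all features (image plus caption fields); individual images vary widely in size.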
|
mikkaelmoura/myllakarvalho1 | ---
license: openrail
---
|
CyberHarem/tilty_claret_tenseioujototensaireijounomahoukakumei | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tilty Claret
This is the dataset of Tilty Claret, containing 147 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 147 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 288 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 147 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 147 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 147 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 147 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 147 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 288 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 288 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 288 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
vilm/viet-pretrained-002 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4502549932
num_examples: 258823
download_size: 2324002719
dataset_size: 4502549932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "viet-pretrained-002"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_haoranxu__ALMA-7B | ---
pretty_name: Evaluation run of haoranxu/ALMA-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_haoranxu__ALMA-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T14:49:25.025957](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-7B/blob/main/results_2023-12-09T14-49-25.025957.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3842750602900277,\n\
\ \"acc_stderr\": 0.0337414380010653,\n \"acc_norm\": 0.388832150301939,\n\
\ \"acc_norm_stderr\": 0.03466152745310896,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3564384771875291,\n\
\ \"mc2_stderr\": 0.013567943486529975\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.0145853058400071,\n\
\ \"acc_norm\": 0.5034129692832765,\n \"acc_norm_stderr\": 0.014611050403244081\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5642302330213105,\n\
\ \"acc_stderr\": 0.004948439229523914,\n \"acc_norm\": 0.7550288787094205,\n\
\ \"acc_norm_stderr\": 0.004291911350430712\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37358490566037733,\n \"acc_stderr\": 0.02977308271331987,\n\
\ \"acc_norm\": 0.37358490566037733,\n \"acc_norm_stderr\": 0.02977308271331987\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04082482904638629,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n\
\ \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n\
\ \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020514,\n\
\ \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020514\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.36129032258064514,\n \"acc_stderr\": 0.027327548447957543,\n\
\ \"acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.027327548447957543\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33497536945812806,\n \"acc_stderr\": 0.0332085274234831,\n \"\
acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.0332085274234831\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524753,\n\
\ \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524753\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.398989898989899,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\"\
: 0.398989898989899,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.48186528497409326,\n \"acc_stderr\": 0.036060650018329185,\n\
\ \"acc_norm\": 0.48186528497409326,\n \"acc_norm_stderr\": 0.036060650018329185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48440366972477067,\n \"acc_stderr\": 0.02142689153920805,\n \"\
acc_norm\": 0.48440366972477067,\n \"acc_norm_stderr\": 0.02142689153920805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457039,\n \"\
acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457039\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.48523206751054854,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.48523206751054854,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.03255326307272486,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.03255326307272486\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5197956577266922,\n\
\ \"acc_stderr\": 0.017865944827291626,\n \"acc_norm\": 0.5197956577266922,\n\
\ \"acc_norm_stderr\": 0.017865944827291626\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.02648339204209818,\n\
\ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.02648339204209818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38562091503267976,\n \"acc_stderr\": 0.027870745278290313,\n\
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.027870745278290313\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n\
\ \"acc_stderr\": 0.02839442137098453,\n \"acc_norm\": 0.5080385852090032,\n\
\ \"acc_norm_stderr\": 0.02839442137098453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422704,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n\
\ \"acc_stderr\": 0.011643576764069548,\n \"acc_norm\": 0.29465449804432853,\n\
\ \"acc_norm_stderr\": 0.011643576764069548\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933112,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933112\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3937908496732026,\n \"acc_stderr\": 0.019766211991073056,\n \
\ \"acc_norm\": 0.3937908496732026,\n \"acc_norm_stderr\": 0.019766211991073056\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n\
\ \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.4527363184079602,\n\
\ \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.037792759455032014,\n\
\ \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.037792759455032014\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3564384771875291,\n\
\ \"mc2_stderr\": 0.013567943486529975\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.012566815015698157\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/haoranxu/ALMA-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|arc:challenge|25_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|gsm8k|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hellaswag|10_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T14-49-25.025957.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- '**/details_harness|winogrande|5_2023-12-09T14-49-25.025957.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T14-49-25.025957.parquet'
- config_name: results
data_files:
- split: 2023_12_09T14_49_25.025957
path:
- results_2023-12-09T14-49-25.025957.parquet
- split: latest
path:
- results_2023-12-09T14-49-25.025957.parquet
---
# Dataset Card for Evaluation run of haoranxu/ALMA-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/haoranxu/ALMA-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_haoranxu__ALMA-7B",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-12-09T14:49:25.025957](https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-7B/blob/main/results_2023-12-09T14-49-25.025957.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3842750602900277,
"acc_stderr": 0.0337414380010653,
"acc_norm": 0.388832150301939,
"acc_norm_stderr": 0.03466152745310896,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731613,
"mc2": 0.3564384771875291,
"mc2_stderr": 0.013567943486529975
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.0145853058400071,
"acc_norm": 0.5034129692832765,
"acc_norm_stderr": 0.014611050403244081
},
"harness|hellaswag|10": {
"acc": 0.5642302330213105,
"acc_stderr": 0.004948439229523914,
"acc_norm": 0.7550288787094205,
"acc_norm_stderr": 0.004291911350430712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37358490566037733,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.37358490566037733,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36129032258064514,
"acc_stderr": 0.027327548447957543,
"acc_norm": 0.36129032258064514,
"acc_norm_stderr": 0.027327548447957543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.0332085274234831,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.0332085274234831
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.03872592983524753,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.03872592983524753
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.398989898989899,
"acc_stderr": 0.03488901616852731,
"acc_norm": 0.398989898989899,
"acc_norm_stderr": 0.03488901616852731
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.48186528497409326,
"acc_stderr": 0.036060650018329185,
"acc_norm": 0.48186528497409326,
"acc_norm_stderr": 0.036060650018329185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48440366972477067,
"acc_stderr": 0.02142689153920805,
"acc_norm": 0.48440366972477067,
"acc_norm_stderr": 0.02142689153920805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48523206751054854,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.48523206751054854,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03255326307272486,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03255326307272486
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5197956577266922,
"acc_stderr": 0.017865944827291626,
"acc_norm": 0.5197956577266922,
"acc_norm_stderr": 0.017865944827291626
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.027870745278290313,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.027870745278290313
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.02839442137098453,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.02839442137098453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422704,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29465449804432853,
"acc_stderr": 0.011643576764069548,
"acc_norm": 0.29465449804432853,
"acc_norm_stderr": 0.011643576764069548
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933112,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933112
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3937908496732026,
"acc_stderr": 0.019766211991073056,
"acc_norm": 0.3937908496732026,
"acc_norm_stderr": 0.019766211991073056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288085,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288085
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5847953216374269,
"acc_stderr": 0.037792759455032014,
"acc_norm": 0.5847953216374269,
"acc_norm_stderr": 0.037792759455032014
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731613,
"mc2": 0.3564384771875291,
"mc2_stderr": 0.013567943486529975
},
"harness|winogrande|5": {
"acc": 0.7237569060773481,
"acc_stderr": 0.012566815015698157
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_unified | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 11288801
num_examples: 18611
download_size: 5543097
dataset_size: 11288801
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_unified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ppdev/medtext-llama2 | ---
license: cc-by-4.0
---
Original data from:
https://huggingface.co/datasets/BI55/MedText
I just reformatted it for fine-tuning Llama 2, based on this article: https://mlabonne.github.io/blog/posts/Fine_Tune_Your_Own_Llama_2_Model_in_a_Colab_Notebook.html
Another important point related to data quality is the prompt template. Prompts are composed of similar elements: a system prompt (optional) to guide the model, a user prompt (required) to give the instruction, additional inputs (optional) to take into consideration, and the model's answer (required). In the case of Llama 2, the authors used the following template for the chat models:
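Before the raw template, here is a minimal Python sketch of wrapping one sample in it. The `<s>`/`</s>` tokens come from the linked article's description of the Llama 2 chat format, and the example strings are invented; neither is part of the original MedText data:

```python
def to_llama2_prompt(user_prompt: str, model_answer: str, system_prompt: str = "") -> str:
    """Wrap one instruction/answer pair in the Llama 2 chat template."""
    # The optional system prompt sits inside <<SYS>> ... <</SYS>> markers.
    sys_block = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n" if system_prompt else ""
    return f"<s>[INST] {sys_block}{user_prompt} [/INST] {model_answer} </s>"

print(to_llama2_prompt("What causes a fever?", "Usually an underlying infection."))
```

The bare template the authors describe: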
`[INST] User prompt [/INST] Model answer` |
nsanghi/axterix-obelix | ---
license: apache-2.0
task_categories:
- image-to-image
language:
- en
tags:
- asterix
- diffusion
- dreambooth
size_categories:
- n<1K
---
This dataset contains 10 images of the Asterix and Obelix cartoon characters taken from the internet.
|
cartesinus/leyzer-fedcsis-translated | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- pl
tags:
- natural-language-understanding
size_categories:
- 10K<n<100K
---
# Leyzer: A Dataset for Multilingual Virtual Assistants
Leyzer is a multilingual text corpus designed to study multilingual and cross-lingual natural language understanding (NLU) models and the strategies of localization of
virtual assistants. It consists of 20 domains across three languages: English, Spanish and Polish, with 186 intents and a wide range of samples, ranging from 1 to 672
sentences per intent. For more stats, please refer to the wiki.
|
mstz/arhythmia | ---
language:
- en
tags:
- arrhythmia
- tabular_classification
- multiclass_classification
- binary_classification
- UCI
pretty_name: Arhythmia
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- arhytmia
- has_arhytmia
license: cc
---
# Arhythmia
The [Arrhythmia dataset](https://archive.ics.uci.edu/ml/datasets/Arrhythmia) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Does the patient have arrhythmia? If so, what type?
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|---------------------------------------------------------------|
| arhytmia          | Multiclass classification | What type of arrhythmia does the patient have?                |
| has_arhytmia      | Binary classification     | Does the patient have arrhythmia?                             |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/arhythmia", "arhythmia")["train"]
```
# Features
The target feature changes according to the selected configuration and is always in the last position in the dataset. |
tyzhu/find_sent_before_sent_train_400_eval_40_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 3760579
num_examples: 1994
- name: validation
num_bytes: 386422
num_examples: 200
download_size: 791972
dataset_size: 4147001
---
# Dataset Card for "find_sent_before_sent_train_400_eval_40_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andstor/output | ---
license: mit
task_categories:
- text-generation
language:
- en
dataset_info:
- config_name: gpt2-xl
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: andstor.the_pile_github.greedy
num_bytes: 60221138
num_examples: 22169
download_size: 66419674
dataset_size: 60221138
- config_name: EleutherAI.gpt-j-6B
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: andstor.the_pile_github.greedy
num_bytes: 67625587
num_examples: 20665
download_size: 73049509
dataset_size: 67625587
- config_name: NinedayWang.PolyCoder-2.7B
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: andstor.the_pile_github.greedy
num_bytes: 58822858
num_examples: 20342
download_size: 63717236
dataset_size: 58822858
- config_name: Salesforce.codegen-16B-multi
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: THUDM.humaneval_x.greedy
num_bytes: 2509745
num_examples: 820
download_size: 2694784
dataset_size: 2509745
- config_name: openai.gpt-3.5-turbo-0613
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: THUDM.humaneval_x.greedy
num_bytes: 958178
num_examples: 820
download_size: 1067958
dataset_size: 958178
- config_name: openai.gpt-4-0613
features:
- name: id
dtype: string
- name: part
sequence: int32
- name: prompt
dtype: string
- name: reference
dtype: string
- name: prediction
dtype: string
- name: ended
dtype: bool
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: THUDM.humaneval_x.greedy
num_bytes: 875401
num_examples: 820
- name: THUDM.humaneval_x.random
num_bytes: 906274
num_examples: 820
download_size: 1995455
dataset_size: 1781675
---
# Dataset Card for Output
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/andstor/lm-output-dataset
- **Repository:** https://github.com/andstor/lm-output-dataset
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [André Storhaug](mailto:andr3.storhaug@gmail.com)
### Dataset Summary
This is a dataset of various language model outputs from different datasets.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@andstor](https://github.com/andstor) for adding this dataset.
|
liangyupu/DoTA_dataset | ---
license: mit
language:
- zh
- en
- fr
- de
size_categories:
- 100K<n<1M
---
# Document Image Machine Translation with Dynamic Multi-pre-trained Models Assembling
This is the official repository for **DoTA** dataset (**Do**cument image machine **T**ranslation dataset of **A**rXiv articles in markdown format) introduced by the following paper: [***Document Image Machine Translation with Dynamic Multi-pre-trained Models Assembling (NAACL 2024 Main)***](https://openreview.net/forum?id=XH2TgKlXWv)
In addition to the 126K samples mentioned in the paper, we provide all 139K samples that have not been filtered.
Each sample contains the original English image, the transcribed English mmd file, and the translated Chinese/French/German mmd files.
Samples used in the paper are listed in a json file.
Text files can be decompressed as follows:
```bash
tar -xzvf zh_mmd.tar.gz -C ./
```
Image files can be decompressed as follows:
```bash
cat imgs.tar.gz.* | tar -xzvf - -C ./
```
If you want to use our dataset, please cite as follows:
```BibTex
@inproceedings{
liang2024document,
title={Document Image Machine Translation with Dynamic Multi-pre-trained Models Assembling},
author={Yupu Liang and Yaping Zhang and Cong MA and Zhiyang Zhang and Yang Zhao and Lu Xiang and Chengqing Zong and Yu Zhou},
booktitle={2024 Annual Conference of the North American Chapter of the Association for Computational Linguistics},
year={2024},
url={https://openreview.net/forum?id=XH2TgKlXWv}
}
``` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-88000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1113664
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
claudios/cubert_ETHPy150Open | ---
license: apache-2.0
task_categories:
- text-classification
pretty_name: CuBERT ETH Py150 Benchmarks
arxiv: 2001.00059
dataset_info:
- config_name: exception_datasets
features:
- name: function
dtype: string
- name: label
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 25423003
num_examples: 18480
- name: dev
num_bytes: 2845822
num_examples: 2088
- name: test
num_bytes: 14064500
num_examples: 10348
download_size: 16935273
dataset_size: 42333325
- config_name: function_docstring_datasets
features:
- name: function
dtype: string
- name: docstring
dtype: string
- name: label
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 261700491
num_examples: 340846
- name: dev
num_bytes: 28498757
num_examples: 37592
- name: test
num_bytes: 141660242
num_examples: 186698
download_size: 121724722
dataset_size: 431859490
- config_name: swapped_operands_datasets
features:
- name: function
dtype: string
- name: label
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 271097336
num_examples: 236246
- name: dev
num_bytes: 29986397
num_examples: 26118
- name: test
num_bytes: 148544957
num_examples: 130972
download_size: 105243573
dataset_size: 449628690
- config_name: variable_misuse_datasets
features:
- name: function
dtype: string
- name: label
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 474283355
num_examples: 700708
- name: dev
num_bytes: 50447683
num_examples: 75478
- name: test
num_bytes: 251591448
num_examples: 378440
download_size: 231302039
dataset_size: 776322486
- config_name: variable_misuse_repair_datasets
features:
- name: function
sequence: string
- name: target_mask
sequence: int64
- name: error_location_mask
sequence: int64
- name: candidate_mask
sequence: int64
- name: provenance
dtype: string
splits:
- name: train
num_bytes: 4417505142
num_examples: 700708
- name: dev
num_bytes: 469436314
num_examples: 75478
- name: test
num_bytes: 2331355329
num_examples: 378440
download_size: 498300512
dataset_size: 7218296785
- config_name: wrong_binary_operator_datasets
features:
- name: function
dtype: string
- name: label
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 439948844
num_examples: 459400
- name: dev
num_bytes: 47620848
num_examples: 49804
- name: test
num_bytes: 239409450
num_examples: 251804
download_size: 163088211
dataset_size: 726979142
configs:
- config_name: exception_datasets
data_files:
- split: train
path: exception_datasets/train-*
- split: dev
path: exception_datasets/dev-*
- split: test
path: exception_datasets/test-*
- config_name: function_docstring_datasets
data_files:
- split: train
path: function_docstring_datasets/train-*
- split: dev
path: function_docstring_datasets/dev-*
- split: test
path: function_docstring_datasets/test-*
- config_name: swapped_operands_datasets
data_files:
- split: train
path: swapped_operands_datasets/train-*
- split: dev
path: swapped_operands_datasets/dev-*
- split: test
path: swapped_operands_datasets/test-*
- config_name: variable_misuse_datasets
data_files:
- split: train
path: variable_misuse_datasets/train-*
- split: dev
path: variable_misuse_datasets/dev-*
- split: test
path: variable_misuse_datasets/test-*
- config_name: variable_misuse_repair_datasets
data_files:
- split: train
path: variable_misuse_repair_datasets/train-*
- split: dev
path: variable_misuse_repair_datasets/dev-*
- split: test
path: variable_misuse_repair_datasets/test-*
- config_name: wrong_binary_operator_datasets
data_files:
- split: train
path: wrong_binary_operator_datasets/train-*
- split: dev
path: wrong_binary_operator_datasets/dev-*
- split: test
path: wrong_binary_operator_datasets/test-*
tags:
- code
---
# CuBERT ETH150 Open Benchmarks
This is an unofficial HuggingFace upload of the [CuBERT ETH150 Open Benchmarks](https://github.com/google-research/google-research/tree/master/cubert). This dataset was released along with [Learning and Evaluating Contextual Embedding of Source Code](https://arxiv.org/abs/2001.00059).
---
## Benchmarks and Fine-Tuned Models
Here we describe the 6 Python benchmarks we created. All 6 benchmarks were derived from [ETH Py150 Open](https://github.com/google-research-datasets/eth_py150_open). All examples are stored as sharded text files. Each text line corresponds to a separate example encoded as a JSON object. For each dataset, we release separate training/validation/testing splits along the same boundaries that ETH Py150 Open uses to split its files. The fine-tuned models are the checkpoints of each model with the highest validation accuracy.
1. **Function-docstring classification**. Combinations of functions with their correct or incorrect documentation string, used to train a classifier that can tell which pairs go together. The JSON fields are:
* `function`: string, the source code of a function as text
* `docstring`: string, the documentation string for that function. Note that the string is unquoted. To be able to properly tokenize it with the CuBERT tokenizers, you have to wrap it in quotes first. For example, in Python, use `string_to_tokenize = f'"""{docstring}"""'`.
* `label`: string, one of (“Incorrect”, “Correct”), the label of the example.
* `info`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, the function name and, for “Incorrect” examples, the function whose docstring was substituted.
1. **Exception classification**. Combinations of functions where one exception type has been masked, along with a label indicating the masked exception type. The JSON fields are:
* `function`: string, the source code of a function as text, in which one exception type has been replaced with the special token “__HOLE__”
* `label`: string, one of (`ValueError`, `KeyError`, `AttributeError`, `TypeError`, `OSError`, `IOError`, `ImportError`, `IndexError`, `DoesNotExist`, `KeyboardInterrupt`, `StopIteration`, `AssertionError`, `SystemExit`, `RuntimeError`, `HTTPError`, `UnicodeDecodeError`, `NotImplementedError`, `ValidationError`, `ObjectDoesNotExist`, `NameError`, `None`), the masked exception type. Note that `None` never occurs in the data and will be removed in a future release.
* `info`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, and the fully-qualified function name.
1. **Variable-misuse classification**. Combinations of functions where one use of a variable may have been replaced with another variable defined in the same context, along with a label indicating if this bug-injection has occurred. The JSON fields are:
* `function`: string, the source code of a function as text.
* `label`: string, one of (“Correct”, “Variable misuse”) indicating if this is a buggy or bug-free example.
* `info`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, the function, and whether the example is bugfree (marked “original”) or the variable substitution that has occurred (e.g., “correct_variable” → “incorrect_variable”).
1. **Swapped-operand classification**. Combinations of functions where one binary operator’s arguments have been swapped, to create a buggy example, or left undisturbed, along with a label indicating if this bug-injection has occurred. The JSON fields are:
* `function`: string, the source code of a function as text.
* `label`: string, one of (“Correct”, “Swapped operands”) indicating if this is a buggy or bug-free example.
* `info`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, the function, and whether the example is bugfree (marked “original”) or the operand swap has occurred (e.g., “swapped operands of `not in`”).
1. **Wrong-binary-operator classification**. Combinations of functions where one binary operator has been swapped with another, to create a buggy example, or left undisturbed, along with a label indicating if this bug-injection has occurred. The JSON fields are:
* `function`: string, the source code of a function as text.
* `label`: string, one of (“Correct”, “Wrong binary operator”) indicating if this is a buggy or bug-free example.
* `info`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, the function, and whether the example is bugfree (marked “original”) or the operator replacement has occurred (e.g., “`==`-> `!=`”).
1. **Variable-misuse localization and repair**. Combinations of functions where one use of a variable may have been replaced with another variable defined in the same context, along with information that can be used to localize and repair the bug, as well as the location of the bug if such a bug exists. The JSON fields are:
* `function`: a list of strings, the source code of a function, tokenized with the vocabulary from item b. Note that, unlike other task datasets, this dataset gives a tokenized function, rather than the code as a single string.
* `target_mask`: a list of integers (0 or 1). If the integer at some position is 1, then the token at the corresponding position of the function token list is a correct repair for the introduced bug. If a variable has been split into multiple tokens, only the first subtoken is marked in this mask. If the example is bug-free, all integers are 0.
* `error_location_mask`: a list of integers (0 or 1). If the integer at some position is 1, then there is a variable-misuse bug at the corresponding location of the tokenized function. In a bug-free example, the first integer is 1. There is exactly one integer set to 1 for all examples. If a variable has been split into multiple tokens, only the first subtoken is marked in this mask.
* `candidate_mask`: a list of integers (0 or 1). If the integer at some position is 1, then the variable starting at that position in the tokenized function is a candidate to consider when repairing a bug. Candidates are all variables defined in the function parameters or via variable declarations in the function. If a variable has been split into multiple tokens, only the first subtoken is marked in this mask, for each candidate.
* `provenance`: string, an unformatted description of how the example was constructed, including the source dataset (always “ETHPy150Open”), the repository and filepath, the function, and whether the example is bugfree (marked “original”) or the buggy/repair token positions and variables (e.g., “16/18 `kwargs` → `self`”). 16 is the position of the introduced error, 18 is the location of the repair.
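Since every example is a single JSON object per line, a plain JSONL reader is enough to inspect the data locally. The snippet below is a minimal sketch; the sample line is synthetic, written to match the field layout described above, not an actual example from the corpus:

```python
import json

# A synthetic line in the format described above for the
# variable-misuse classification task (fields: function, label, info).
line = json.dumps({
    "function": "def add(a, b):\n    return a + a\n",
    "label": "Variable misuse",
    "info": "ETHPy150Open some/repo some/file.py add b -> a",
})

def read_examples(lines):
    """Parse JSONL benchmark lines into Python dicts, skipping blanks."""
    for raw in lines:
        raw = raw.strip()
        if raw:
            yield json.loads(raw)

example = next(read_examples([line]))
print(example["label"])  # -> Variable misuse
```

The same reader works for the other classification tasks, since they differ only in which JSON fields are present.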
## Citation
```bibtex
@inproceedings{cubert,
author = {Aditya Kanade and
Petros Maniatis and
Gogul Balakrishnan and
Kensen Shi},
title = {Learning and evaluating contextual embedding of source code},
booktitle = {Proceedings of the 37th International Conference on Machine Learning,
{ICML} 2020, 12-18 July 2020},
series = {Proceedings of Machine Learning Research},
publisher = {{PMLR}},
year = {2020},
}
``` |
thbndi/Mimic4Dataset | ---
tags:
- medical
---
# Dataset Usage
## Description
The Mimic-IV dataset generates data by executing the pipeline available at https://github.com/healthylaife/MIMIC-IV-Data-Pipeline.
## Function Signature
```python
load_dataset('thbndi/Mimic4Dataset', task, mimic_path=mimic_data, config_path=config_file, encoding=encod, generate_cohort=gen_cohort, val_size=size, cache_dir=cache)
```
## Arguments
1. `task` (string) :
- Description: Specifies the task you want to perform with the dataset.
- Default: "Mortality"
- Note: Possible Values : 'Phenotype', 'Length of Stay', 'Readmission', 'Mortality'
2. `mimic_path` (string) :
- Description: Complete path to the Mimic-IV raw data on user's machine.
- Note: You need to provide the appropriate path where the Mimic-IV data is stored. The path should end with the version of MIMIC (e.g. mimiciv/2.2). Supported versions: 2.2 and 1.0, as provided by the authors of the pipeline.
3. `config_path` (string) optional:
- Description: Path to the configuration file for the cohort generation choices (more info in '/config/readme.md').
- Default: Configuration file provided in the 'config' folder.
4. `encoding` (string) optional:
- Description: Data encoding option for the features.
- Options: "concat", "aggreg", "tensor", "raw", "text"
- Default: "concat"
- Note: Choose one of the following options for data encoding:
- "concat": Concatenates the one-hot encoded diagnoses, demographic data vector, and dynamic features at each measured time instant, resulting in a high-dimensional feature vector.
- "aggreg": Concatenates the one-hot encoded diagnoses, demographic data vector, and dynamic features, where each item_id is replaced by the average of the measured time instants, resulting in a reduced-dimensional feature vector.
- "tensor": Represents each feature as an 2D array. There are separate arrays for labels, demographic data ('DEMO'), diagnosis ('COND'), medications ('MEDS'), procedures ('PROC'), chart/lab events ('CHART/LAB'), and output events data ('OUT'). Dynamic features are represented as 2D arrays where each row contains values at a specific time instant.
- "raw": Provide cohort from the pipeline without any encoding for custom data processing.
- "text": Represents diagnoses as text suitable for BERT or other similar text-based models.
- For 'concat' and 'aggreg', the composition of the vector is given in the './data/dict/"task"/features_aggreg.csv' or './data/dict/"task"/features_concat.csv' file and in the 'features_names' column of the dataset.
5. `generate_cohort` (bool) optional:
- Description: Determines whether to generate a new cohort from Mimic-IV data.
- Default: True
- Note: Set it to True to generate a cohort, or False to skip cohort generation.
6. `val_size`, `test_size` (float) optional:
- Description: Proportion of the dataset used for validation during training.
- Default: 0.1 for validation size and 0.2 for testing size.
- Note: Can be set to 0.
7. `cache_dir` (string) optional:
- Description: Directory where the processed dataset will be cached.
- Note: Providing a cache directory for each encoding type can avoid errors when changing the encoding type.
## Example Usage
```python
import datasets
from datasets import load_dataset
# Example 1: Load dataset with default settings
dataset = load_dataset('thbndi/Mimic4Dataset', task="Mortality", mimic_path="/path/to/mimic_data")
# Example 2: Load dataset with custom settings
dataset = load_dataset('thbndi/Mimic4Dataset', task="Phenotype", mimic_path="/path/to/mimic_data", config_path="/path/to/config_file", encoding="aggreg", generate_cohort=False, val_size=0.2, cache_dir="/path/to/cache_dir")
```
Please note that the provided examples are for illustrative purposes only, and you should adjust the paths and settings based on your actual dataset and specific use case. |
yuan-sf63/chenyu_mask_32 | ---
dataset_info:
features:
- name: feature
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 4349637.991911224
num_examples: 53852
- name: validation
num_bytes: 483329.00808877597
num_examples: 5984
download_size: 0
dataset_size: 4832967.0
---
# Dataset Card for "chenyu_mask_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allegro/klej-nkjp-ner-en | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
- pl
pretty_name: NKPJ-NER translated to English
size_categories:
- n<1K
---
All instances from `allegro/klej-nkjp-ner` (train, val, test) were translated to English with the Google Translate API.
Columns:
- `source` - text instance in Polish.
- `target` - text instance in English. |
UnbiasedMoldInspectionsIN/glmra1m4-9 | ---
license: apache-2.0
---
|
communityai/OdiaGenAI___gpt-teacher-roleplay-odia-3k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 6893531.0
num_examples: 3146
download_size: 2480702
dataset_size: 6893531.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ovior/twitter_dataset_1713094312 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2391221
num_examples: 7063
download_size: 1380144
dataset_size: 2391221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Maximax67/Words-CEFR-Dataset | ---
license: mit
task_categories:
- token-classification
language:
- en
pretty_name: Words CEFR Dataset
size_categories:
- 100K<n<1M
tags:
- CEFR
- POS
- NLP
- SQLite
configs:
- config_name: sqlite_db
data_files: word_cefr_minified.db
---
# Words CEFR Dataset
This project focuses on analyzing and categorizing English words based on their CEFR levels (from A1 to C2). I computed CEFR levels for every valid English word and its corresponding part of speech by considering various factors including the average levels of other parts of speech for the same word, lemma levels, stem levels, as well as lemma, stem, and word frequencies.
This dataset is also available on GitHub: https://github.com/Maximax67/Words-CEFR-Dataset
## Example of usage
To perform a basic word CEFR analysis, execute the **Text-Analizer.ipynb** file. This notebook provides a practical demonstration of how to analyze words and determine their corresponding CEFR levels.
### Demo text (generated by ChatGPT 3.5):
```
In the heart of every forest, a hidden world thrives among the towering trees. Trees, those silent giants, are more than just passive observers of nature's drama; they are active participants in an intricate dance of life.
Did you know that trees communicate with each other? It's not through words or gestures like ours, but rather through a complex network of fungi that connect their roots underground. This network, often called the "wood wide web," allows trees to share nutrients, water, and even warnings about potential threats.
But trees are not just generous benefactors; they are also masters of adaptation. Take the mighty sequoias, for example, towering giants that have stood the test of time for thousands of years. These giants have evolved thick, fire-resistant bark to withstand the frequent wildfires of their native California.
And speaking of longevity, did you know that some trees have been around for centuries, witnessing history unfold? The ancient bristlecone pines of the American West, for instance, can live for over 5,000 years, making them some of the oldest living organisms on Earth.
So the next time you find yourself wandering through a forest, take a moment to appreciate the remarkable world of trees. They may seem like silent spectators, but their lives are full of fascinating stories waiting to be discovered.
```
### Results:
```
NLP: 318 ms
CEFR levels: 3 ms
------------------------------
Text length: 1370
Total tokens: 275
```
```
CEFR statistic (total words):
A1: 136
A2: 37
B1: 27
B2: 11
C1: 2
C2: 7
CEFR statistic (unique words):
A1: 69
A2: 34
B1: 23
B2: 11
C1: 2
C2: 7
```
```
Not found words: 0
```
```
Words with level B2 and higher: 17
mighty JJ 4.00 B2
potential JJ 4.00 B2
bristlecone NN 6.00 C2
living NN 4.00 B2
longevity NN 5.97 C2
california NNP 6.00 C2
benefactors NNS 6.00 C2
fungi NNS 5.19 C1
masters NNS 4.00 B2
observers NNS 4.00 B2
pines NNS 4.00 B2
sequoias NNS 6.00 C2
wildfires NNS 6.00 C2
underground RB 4.00 B2
withstand VB 5.12 C1
evolved VBN 4.00 B2
thrives VBZ 5.86 C2
```
## Data Collection Process
### Making English valid words list
For this project I created a [valid English words list](https://github.com/Maximax67/English-Valid-Words). It includes words, frequency counts, and word stems along with their associated probabilities of being valid words. I used the [valid_words_sorted_by_frequency.csv](https://github.com/Maximax67/English-Valid-Words/blob/main/valid_words_sorted_by_frequency.csv) file, so all words in the `words` table of the SQLite database are sorted by frequency.
### Data Processing Steps
The data processing pipeline, implemented in the **Word-CEFR.ipynb** notebook, involves the following steps:
1. **Parsing Google 1-grams Dataset**: Extracting frequency data for each valid word's part of speech, keeping entries with frequency counts greater than 10,000. I used [spaCy](https://spacy.io/) for more accurate part-of-speech (POS) tagging based on the [Penn Treebank Project's list of POS tags](https://www.ling.upenn.edu/courses/Fall_2003/ling001/penn_treebank_pos.html). Additionally, I used [LemmInflect](https://github.com/bjascob/LemmInflect) for obtaining word lemmas.
2. **Parsing CEFR-J Dataset**: I parsed it to get the CEFR level for some words based on their POS. In this step I also parsed the core usage categories.
3. **Calculation of Average Frequencies for each CEFR level and Interpolation**.
4. **Assigning CEFR Levels**: Determining the CEFR level for each word's POS based on average levels of other POS for the same word, lemma levels, stem levels, as well as lemma, stem and word frequencies.
5. **Database Optimization**: Minimizing database size by consolidating word frequency data from 1900 to 2019 into a single total value. Additionally, calculating the average POS level from all available sources. The SQLite database is now optimized and has a reduced size of 20 MB. Refer to the **Minify_db.ipynb** file for more details.
## Possible Improvements
1. **Incorporating Additional Datasets**: To obtain more precise data, consider parsing the [Octanove Vocabulary profile](https://github.com/openlanguageprofiles/olp-en-cefrj/blob/master/octanove-vocabulary-profile-c1c2-1.0.csv) dataset, which provides C1 and C2 level vocabulary data. However, please note that this dataset is licensed under the [Creative Commons Attribution-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-sa/4.0/). You can also parse the [World level survey by Zenodo](https://zenodo.org/records/12501) dataset to further enrich the data; it is licensed under the [Creative Commons Attribution 4.0 International license](https://creativecommons.org/licenses/by/4.0/legalcode).
2. **Filtering Personal Names and Geographical Entities**: You can improve result accuracy by identifying personal names, countries, cities, and similar entities and excluding them from CEFR-level display. This refinement helps ensure the analysis focuses solely on linguistic content.
## Dataset files included
1. **word_cefr_minified.db**: SQLite3 database.

2. **csv/words.csv**:
* word_id
* word
* stem_word_id
3. **csv/word_pos.csv**:
* word_pos_id
* word_id
* pos_tag_id
* lemma_word_id
* frequency_count
* level
4. **csv/word_categories.csv**:
* word_pos_id
* category_id
5. **csv/pos_tags.csv**:
* tag_id
* tag
* description
6. **csv/categories.csv**:
* category_id
* category_title
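The CSV exports above suggest how the SQLite database can be queried. The sketch below looks up the CEFR level for each POS of a word; the table and column names are inferred from the CSV layout and are assumptions about the actual schema of **word_cefr_minified.db**.

```python
import sqlite3


def lookup_levels(conn, word):
    """Return (pos_tag, level) pairs for a word.

    Assumes tables `words`, `word_pos`, and `pos_tags` mirroring the
    CSV files listed above; the real database schema may differ.
    """
    return conn.execute(
        """
        SELECT pt.tag, wp.level
        FROM words w
        JOIN word_pos wp ON wp.word_id = w.word_id
        JOIN pos_tags pt ON pt.tag_id = wp.pos_tag_id
        WHERE w.word = ?
        """,
        (word.lower(),),
    ).fetchall()


# Usage against the database file shipped with this dataset:
# conn = sqlite3.connect("word_cefr_minified.db")
# print(lookup_levels(conn, "run"))
```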
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
I would like to acknowledge the contributions of the following resources:
- [spaCy](https://spacy.io/)
- [CEFR-J](https://cefr-j.org/)
- [LemmInflect](https://github.com/bjascob/LemmInflect)
- [The Google Books Ngram Viewer (used 1-grams dataset, version 20200217)](https://books.google.com/ngrams/)
- [List of POS tags from the Penn Treebank Project](https://www.ling.upenn.edu/courses/Fall_2003/ling001/penn_treebank_pos.html)
I also used these resources to create my [valid English words list](https://github.com/Maximax67/English-Valid-Words):
- [Word list by infochimps (archived)](https://web.archive.org/web/20131118073324/https://www.infochimps.com/datasets/word-list-350000-simple-english-words-excel-readable)
- [English words github repo by dwyl](https://github.com/dwyl/english-words)
- [NLTK (Natural Language Toolkit)](https://www.nltk.org/)
- [WordNet](https://wordnet.princeton.edu/) |
ibranze/araproje_hellaswag_en_f1 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 0
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_f1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
darrel999/java-file-1107 | ---
dataset_info:
features:
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 5108304
num_examples: 1107
download_size: 1329459
dataset_size: 5108304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare | ---
pretty_name: Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [louisbrulenaudet/Pearl-7B-0210-dare](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T16:34:01.841503](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare/blob/main/results_2024-02-11T16-34-01.841503.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6238865006281977,\n\
\ \"acc_stderr\": 0.03273771754882751,\n \"acc_norm\": 0.6230212223075242,\n\
\ \"acc_norm_stderr\": 0.03342419316334821,\n \"mc1\": 0.5752753977968176,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.7146460351141442,\n\
\ \"mc2_stderr\": 0.015047159780719415\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238361,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907597\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7264489145588529,\n\
\ \"acc_stderr\": 0.004448701611795089,\n \"acc_norm\": 0.8879705238000398,\n\
\ \"acc_norm_stderr\": 0.003147581209374547\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"\
acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593566,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593566\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.014385525076611578,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.014385525076611578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297243,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297243\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902164,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902164\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5752753977968176,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.7146460351141442,\n\
\ \"mc2_stderr\": 0.015047159780719415\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433542\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \
\ \"acc_stderr\": 0.013270100238748831\n }\n}\n```"
repo_url: https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-34-01.841503.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- '**/details_harness|winogrande|5_2024-02-11T16-34-01.841503.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T16-34-01.841503.parquet'
- config_name: results
data_files:
- split: 2024_02_11T16_34_01.841503
path:
- results_2024-02-11T16-34-01.841503.parquet
- split: latest
path:
- results_2024-02-11T16-34-01.841503.parquet
---
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-dare](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T16:34:01.841503](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare/blob/main/results_2024-02-11T16-34-01.841503.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6238865006281977,
"acc_stderr": 0.03273771754882751,
"acc_norm": 0.6230212223075242,
"acc_norm_stderr": 0.03342419316334821,
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.7146460351141442,
"mc2_stderr": 0.015047159780719415
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907597
},
"harness|hellaswag|10": {
"acc": 0.7264489145588529,
"acc_stderr": 0.004448701611795089,
"acc_norm": 0.8879705238000398,
"acc_norm_stderr": 0.003147581209374547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593566,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593566
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611578,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513398,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297243,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297243
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902164,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902164
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.7146460351141442,
"mc2_stderr": 0.015047159780719415
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433542
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748831
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DavronSherbaev/uzbekvoice | ---
license: apache-2.0
task_categories:
- automatic-speech-recognition
language:
- uz
size_categories:
- 100K<n<1M
--- |
urialon/converted_narrative_qa | ---
dataset_info:
features:
- name: id
dtype: string
- name: pid
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 18019445085
num_examples: 55003
- name: validation
num_bytes: 1900648400
num_examples: 5878
- name: test
num_bytes: 3228274423
num_examples: 10306
download_size: 8524652529
dataset_size: 23148367908
---
# Dataset Card for "converted_narrative_qa"
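Cards like this one carry their split and size metadata in a YAML front-matter block delimited by a pair of `---` lines, followed by the markdown body. A minimal sketch, using only the Python standard library, of pulling that front-matter block out of a raw card string (the sample card text below is abridged, not the full card):

```python
def extract_front_matter(card_text: str) -> str:
    """Return the YAML front-matter block between the first pair of '---' lines.

    Returns an empty string if the card does not start with a '---' delimiter
    or the closing delimiter is missing.
    """
    lines = card_text.splitlines()
    if not lines or lines[0].strip() != "---":
        return ""
    for i, line in enumerate(lines[1:], start=1):
        if line.strip() == "---":
            # Join everything strictly between the two delimiter lines.
            return "\n".join(lines[1:i])
    return ""


# Abridged example card in the same shape as the records above.
card = """---
dataset_info:
  splits:
  - name: train
    num_examples: 55003
---
# Dataset Card for "converted_narrative_qa"
"""

front_matter = extract_front_matter(card)
print(front_matter)
```

The extracted string can then be handed to any YAML parser (e.g. PyYAML's `yaml.safe_load`) to recover the split names and example counts programmatically.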
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kraykzzz/Voz | ---
license: openrail
---
|
bdsaglam/musique-answerable-2hop-subset-jerx-reward | ---
dataset_info:
features:
- name: id
dtype: string
- name: jerx.input
dtype: string
- name: jerx.output
dtype: string
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 86982
num_examples: 110
download_size: 0
dataset_size: 86982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nicolas-BZRD/uld_loss_Llama-2-7b-chat-hf-qed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
list:
- name: end
dtype: int64
- name: start
dtype: int64
- name: string
dtype: string
- name: answers_generated
dtype: string
splits:
- name: train
num_bytes: 5124485
num_examples: 7007
- name: validation
num_bytes: 449998
num_examples: 610
download_size: 3652432
dataset_size: 5574483
---
# Dataset Card for "uld_loss_Llama-2-7b-chat-hf-qed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |