| datasetId | card |
|---|---|
yeshpanovrustem/ner-kazakh | ---
language:
- kk
license: cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: kaznerd
pretty_name: A Named Entity Recognition Dataset for Kazakh
viewer: true
dataset_info:
config_name: ner_kazakh
features:
- name: index
dtype: string
- name: sentence_id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ADAGE
'2': I-ADAGE
'3': B-ART
'4': I-ART
'5': B-CARDINAL
'6': I-CARDINAL
'7': B-CONTACT
'8': I-CONTACT
'9': B-DATE
'10': I-DATE
'11': B-DISEASE
'12': I-DISEASE
'13': B-EVENT
'14': I-EVENT
'15': B-FACILITY
'16': I-FACILITY
'17': B-GPE
'18': I-GPE
'19': B-LANGUAGE
'20': I-LANGUAGE
'21': B-LAW
'22': I-LAW
'23': B-LOCATION
'24': I-LOCATION
'25': B-MISCELLANEOUS
'26': I-MISCELLANEOUS
'27': B-MONEY
'28': I-MONEY
'29': B-NON_HUMAN
'30': I-NON_HUMAN
'31': B-NORP
'32': I-NORP
'33': B-ORDINAL
'34': I-ORDINAL
'35': B-ORGANISATION
'36': I-ORGANISATION
'37': B-PERSON
'38': I-PERSON
'39': B-PERCENTAGE
'40': I-PERCENTAGE
'41': B-POSITION
'42': I-POSITION
'43': B-PRODUCT
'44': I-PRODUCT
'45': B-PROJECT
'46': I-PROJECT
'47': B-QUANTITY
'48': I-QUANTITY
'49': B-TIME
'50': I-TIME
splits:
- name: train
num_bytes: 26219395
num_examples: 88540
- name: validation
num_bytes: 3268409
num_examples: 11067
- name: test
num_bytes: 3252196
num_examples: 11068
download_size: 9016377
dataset_size: 32740000
configs:
- config_name: ner_kazakh
data_files:
- split: train
path: ner_kazakh/train-*
- split: validation
path: ner_kazakh/validation-*
- split: test
path: ner_kazakh/test-*
---
# A Named Entity Recognition Dataset for Kazakh
- This is a modified version of the dataset provided in the [LREC 2022](https://lrec2022.lrec-conf.org/en/) paper [*KazNERD: Kazakh Named Entity Recognition Dataset*](https://aclanthology.org/2022.lrec-1.44).
- The original repository for the paper can be found at [github.com/IS2AI/KazNERD](https://github.com/IS2AI/KazNERD).
- Tokens denoting speech disfluencies and hesitations (enclosed in parentheses) and background noise (enclosed in square brackets) were removed.
- A total of 2,027 duplicate sentences were removed.
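For illustration, here is a minimal sketch (not from the official repository) that decodes integer `ner_tags` back into their BIO labels, using the class-label order declared in this card's metadata; the example sentence and its tags are made up:

```python
# BIO label list, in the exact order declared in this card's metadata.
NER_LABELS = [
    "O",
    "B-ADAGE", "I-ADAGE", "B-ART", "I-ART",
    "B-CARDINAL", "I-CARDINAL", "B-CONTACT", "I-CONTACT",
    "B-DATE", "I-DATE", "B-DISEASE", "I-DISEASE",
    "B-EVENT", "I-EVENT", "B-FACILITY", "I-FACILITY",
    "B-GPE", "I-GPE", "B-LANGUAGE", "I-LANGUAGE",
    "B-LAW", "I-LAW", "B-LOCATION", "I-LOCATION",
    "B-MISCELLANEOUS", "I-MISCELLANEOUS", "B-MONEY", "I-MONEY",
    "B-NON_HUMAN", "I-NON_HUMAN", "B-NORP", "I-NORP",
    "B-ORDINAL", "I-ORDINAL", "B-ORGANISATION", "I-ORGANISATION",
    "B-PERSON", "I-PERSON", "B-PERCENTAGE", "I-PERCENTAGE",
    "B-POSITION", "I-POSITION", "B-PRODUCT", "I-PRODUCT",
    "B-PROJECT", "I-PROJECT", "B-QUANTITY", "I-QUANTITY",
    "B-TIME", "I-TIME",
]

def decode_tags(tokens, tag_ids):
    """Pair each token with its human-readable BIO tag."""
    return [(token, NER_LABELS[tag]) for token, tag in zip(tokens, tag_ids)]

# A made-up example, not an actual dataset row:
tokens = ["Қазақстан", "Республикасы", "1991", "жылы"]
tag_ids = [17, 18, 9, 10]  # B-GPE, I-GPE, B-DATE, I-DATE
print(decode_tags(tokens, tag_ids))
```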
## Dataset Description
- **Homepage:** [homepage](https://issai.nu.edu.kz/kaznerd-eng/)
- **Repository:** [github](https://github.com/IS2AI/KazNERD)
- **Paper:** [paper](https://aclanthology.org/2022.lrec-1.44)
- **Point of Contact:** [Rustem Yeshpanov](mailto:rustem.yeshpanov@nu.edu.kz)
### Statistics for training (Train), validation (Valid), and test (Test) sets
| Unit | Train | Valid | Test | Total |
| :---: | :---: | :---: | :---: | :---: |
| Sentence | 88,540 (80.00%) | 11,067 (10.00%) | 11,068 (10.00%) | 110,675 (100%) |
| Token | 1,088,461 (80.04%) | 136,021 (10.00%) | 135,426 (9.96%) | 1,359,908 (100%) |
| NE | 106,148 (80.17%) | 13,189 (9.96%) | 13,072 (9.87%) | 132,409 (100%) |
### 80 / 10 / 10 split
|Representation| Train | Valid | Test | Total |
| :---: | :---: | :---: | :---: | :---: |
| **AID** | 67,582 (79.99%) | 8,439 (9.99%) | 8,467 (10.02%)| 84,488 (100%) |
| **BID** | 19,006 (80.11%) | 2,380 (10.03%) | 2,338 (9.85%)| 23,724 (100%) |
| **CID** | 1,050 (78.89%) | 138 (10.37%) | 143 (10.74%) | 1,331 (100%) |
| **DID** | 633 (79.22%) | 82 (10.26%) | 84 (10.51%) | 799 (100%) |
| **EID** | 260 (81.00%) | 27 (8.41%) | 34 (10.59%)| 321 (100%) |
| **FID** | 9 (75.00%) | 1 (8.33%)| 2 (16.67%)| 12 (100%) |
|**Total**| **88,540 (80.00%)** | **11,067 (10.00%)** | **11,068 (10.00%)** | **110,675 (100%)** |
### Distribution of representations across sets
|Representation| Train | Valid | Test | Total |
| :---: | :---: | :---: | :---: | :---: |
| **AID** | 67,582 (76.33%) | 8,439 (76.25%) | 8,467 (76.50%)| 84,488 (76.34%) |
| **BID** | 19,006 (21.47%) | 2,380 (21.51%) | 2,338 (21.12%)| 23,724 (21.44%) |
| **CID** | 1,050 (1.19%) | 138 (1.25%) | 143 (1.29%) | 1,331 (1.20%) |
| **DID** | 633 (0.71%) | 82 (0.74%) | 84 (0.76%) | 799 (0.72%) |
| **EID** | 260 (0.29%) | 27 (0.24%) | 34 (0.31%)| 321 (0.29%) |
| **FID** | 9 (0.01%) | 1 (0.01%)| 2 (0.02%)| 12 (0.01%) |
|**Total**| **88,540 (100.00%)** | **11,067 (100.00%)** | **11,068 (100.00%)** | **110,675 (100%)** |
### Distribution of NEs across sets
| **NE Class** | **Train** | **Valid** | **Test** | **Total** |
|:---:| :---: | :---: | :---: | :---: |
| **ADAGE** | 153 (0.14%) | 19 (0.14%) | 17 (0.13%) | 189 (0.14%) |
| **ART** | 1,533 (1.44%) | 155 (1.18%) | 161 (1.23%) | 1,849 (1.40%) |
| **CARDINAL** | 23,135 (21.80%) | 2,878 (21.82%) | 2,789 (21.34%) | 28,802 (21.75%) |
| **CONTACT** | 159 (0.15%) | 18 (0.14%) | 20 (0.15%) | 197 (0.15%) |
| **DATE** | 20,006 (18.85%) | 2,603 (19.74%) | 2,584 (19.77%) | 25,193 (19.03%) |
| **DISEASE** | 1,022 (0.96%) | 121 (0.92%) | 119 (0.91%) | 1,262 (0.95%) |
| **EVENT** | 1,331 (1.25%) | 154 (1.17%) | 154 (1.18%) | 1,639 (1.24%) |
| **FACILITY** | 1,723 (1.62%) | 178 (1.35%) | 197 (1.51%) | 2,098 (1.58%) |
| **GPE** | 13,625 (12.84%) | 1,656 (12.56%) | 1,691 (12.94%) | 16,972 (12.82%) |
| **LANGUAGE** | 350 (0.33%) | 47 (0.36%) | 41 (0.31%) | 438 (0.33%) |
| **LAW** | 419 (0.39%) | 56 (0.42%) | 55 (0.42%) | 530 (0.40%) |
| **LOCATION** | 1,736 (1.64%) | 210 (1.59%) | 208 (1.59%) | 2,154 (1.63%) |
| **MISCELLANEOUS** | 191 (0.18%) | 26 (0.20%) | 26 (0.20%) | 243 (0.18%) |
| **MONEY** | 3,652 (3.44%) | 455 (3.45%) | 427 (3.27%) | 4,534 (3.42%) |
| **NON_HUMAN** | 6 (0.01%) | 1 (0.01%) | 1 (0.01%) | 8 (0.01%) |
| **NORP** | 2,929 (2.76%) | 374 (2.84%) | 368 (2.82%) | 3,671 (2.77%) |
| **ORDINAL** | 3,054 (2.88%) | 385 (2.92%) | 382 (2.92%) | 3,821 (2.89%) |
| **ORGANISATION** | 5,956 (5.61%) | 753 (5.71%) | 718 (5.49%) | 7,427 (5.61%) |
| **PERCENTAGE** | 3,357 (3.16%) | 437 (3.31%) | 462 (3.53%) | 4,256 (3.21%) |
| **PERSON** | 9,817 (9.25%) | 1,175 (8.91%) | 1,151 (8.81%) | 12,143 (9.17%) |
| **POSITION** | 4,844 (4.56%) | 587 (4.45%) | 597 (4.57%) | 6,028 (4.55%) |
| **PRODUCT** | 586 (0.55%) | 73 (0.55%) | 75 (0.57%) | 734 (0.55%) |
| **PROJECT** | 1,681 (1.58%) | 209 (1.58%) | 206 (1.58%) | 2,096 (1.58%) |
| **QUANTITY** | 3,063 (2.89%) | 411 (3.12%) | 403 (3.08%) | 3,877 (2.93%) |
| **TIME** | 1,820 (1.71%) | 208 (1.58%) | 220 (1.68%) | 2,248 (1.70%) |
| **Total** | **106,148 (100%)** | **13,189 (100%)** | **13,072 (100%)** | **132,409 (100%)** | |
totally-not-an-llm/airoboros-1.4.1-graded | ---
license: other
license_name: airoboros
license_link: LICENSE
---
|
pharaouk/cortex_3 | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 22225852
num_examples: 9807
download_size: 11066541
dataset_size: 22225852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dongyoung4091/shp-generated_flan_t5_rx_xl_all | ---
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
- name: model_A
dtype: float64
- name: model_B
dtype: float64
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 27460355
num_examples: 25600
download_size: 2234625
dataset_size: 27460355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shp-generated_flan_t5_rx_xl_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MemGPT/MemGPT-DPO-Dataset | ---
task_categories:
- text-generation
language:
- en
tags:
- function calling
- function
- memgpt
pretty_name: MemGPT-DPO-Dataset
size_categories:
- 10K<n<100K
---

**MemGPT-DPO-Dataset** is our initial release of a potential series of datasets.
*Please check the* ***"Files"*** *tab for other languages!*
## Details
The dataset is synthetically generated by **GPT-4**, led by [@starsnatched](https://huggingface.co/starsnatched) and [@cpacker](https://huggingface.co/cpacker).
This dataset is intended to be used with **text-generation models**, such as [Mistral-7B-Instruct](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2). The dataset allows the LLM to learn to use [MemGPT-specific tools](https://memgpt.readme.io/docs/presets).
#### → Features
Teaches an LLM to prefer one function over another.
#### → Dataset size & splits
The dataset in this repository contains **42,293 rows**, with a single **train** split.
#### → Data annotation
**Prompt**: Examples of potential user queries.\
**Chosen**: The name of the function that the LLM should prefer.\
**Rejected**: The name of the function that the LLM should **NOT** prefer.
#### → Data collection process
This dataset is **entirely generated by GPT-4** using prompt engineering.
#### → Data cleaning
A quick manual examination was performed on the dataset, and **some** pairs were removed due to unwanted function preferences. **No harmful content** was spotted during the examination.
#### → Use cases
This dataset is mainly intended for **DPO** fine-tuning of an LLM. However, this can be used for **SFT** fine-tuning as well.
## Code Snippet (examples)
Below is an example Python code to map the given dataset into **ChatML** format:
```python
import random
from datetime import datetime

def chatml_format(example):
    """Map a dataset row onto partial ChatML prompt/chosen/rejected strings."""
    prompt = (
        "<|im_start|>user\n{\n    \"type\": \"user_message\",\n    \"message\": \""
        + example["prompt"]
        + "\",\n    \"time\": \"" + generate_random_time() + "\"\n}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )
    # The targets are *partial* JSON objects that stop right after the
    # function name, so training focuses on the choice of function.
    chosen = '{\n    "function": "' + example["chosen"] + '",'
    rejected = '{\n    "function": "' + example["rejected"] + '",'
    return {
        "prompt": prompt,
        "chosen": chosen,
        "rejected": rejected,
    }

def generate_random_time():
    """Return a random "YYYY-MM-DD hh:mm:ss AM/PM" timestamp (the AM/PM
    marker is re-rolled independently of the generated hour)."""
    year = random.randint(2024, 2025)
    month = random.randint(1, 12)
    day = random.randint(1, 28)
    hour = random.randint(1, 12)
    minute = random.randint(0, 59)
    second = random.randint(0, 59)
    am_pm = random.choice(["AM", "PM"])
    dt = datetime(year, month, day, hour, minute, second)
    formatted_time = dt.strftime("%Y-%m-%d %I:%M:%S %p")
    # Swap the strftime AM/PM suffix for the randomly chosen one.
    return formatted_time[:-3] + " " + am_pm
```
The above code should return a partial prompt-output pair like this:
```
# Chosen example
<|im_start|>user
{
"type": "user_message",
"message": "EXAMPLE USER PROMPT",
"time": "RANDOM TIME GENERATED"
}<|im_end|>
<|im_start|>assistant
{
"function": "EXAMPLE FUNCTION", # The assistant generates from here.
```
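As a quick sanity check (the function name below is made up, not necessarily a real MemGPT function), a partial `chosen`/`rejected` string can be closed into valid JSON to recover the preferred function:

```python
import json

def recover_function_name(partial: str) -> str:
    """Close a partial assistant output ('{ "function": "<name>",')
    into valid JSON and return the function name."""
    completed = partial.rstrip().rstrip(",") + "}"
    return json.loads(completed)["function"]

# "send_message" is a hypothetical example value for the "chosen" column.
chosen = '{\n    "function": "send_message",'
print(recover_function_name(chosen))  # send_message
```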
## Motivation
We found that using GPT-4 with MemGPT is not very cost-efficient. Some users have reported that after just a dozen conversation turns, their OpenAI usage bills reached **above $1-2**. However, users have also reported that open-source models are **not as performant** as GPT-4, sometimes calling the wrong function or, most of the time, not calling the necessary function at all. To combat this potential deal-breaker for most people, we decided to fine-tune an LLM specifically trained to be used with MemGPT. We aim to create an LLM that can **surpass GPT-4**'s function-calling capabilities when used with MemGPT, and hopefully help other users create their own MemGPT-LLM using our dataset. |
BiMediX/mmlu-college_biology-arabic | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 72743
num_examples: 144
download_size: 37465
dataset_size: 72743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mr-TD/MOM-Summary-Dataset | ---
dataset_info:
features:
- name: Meeting Transcript
dtype: string
- name: Summary
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3761645
num_examples: 767
download_size: 1426442
dataset_size: 3761645
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cookinai/TRRR-CoT | ---
license: apache-2.0
tags:
- synthetic
---
**TRRR**:
1. **Think** about your response
2. **Respond** how you normally would
3. **Reflect** on your response
4. **Respond** again, but this time use all the information you have now
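The four steps above can be sketched as a single formatting function; the section markers and field names here are illustrative, not necessarily the dataset's actual format:

```python
def format_trrr(thought, first_response, reflection, final_response):
    """Assemble the four TRRR stages into one chain-of-thought target."""
    return (
        f"### Think\n{thought}\n\n"
        f"### Respond\n{first_response}\n\n"
        f"### Reflect\n{reflection}\n\n"
        f"### Respond\n{final_response}"
    )

# A toy example to show the layout:
example = format_trrr(
    "The user wants the sum of 2 and 3.",
    "The answer is 5.",
    "The arithmetic checks out; nothing to correct.",
    "2 + 3 = 5.",
)
print(example)
```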
The inputs are from the high-quality CoT dataset Locutusque/OpenCerebrum-SFT, and the outputs were generated by Mixtral (with Groq!) and formatted with this TRRR structure in an attempt to improve its responses.
Benchmarks are pending to test this way of applying CoT to a model. |
harshamuthukuru/pneumonia | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 363694853.625
num_examples: 3875
download_size: 331363651
dataset_size: 363694853.625
---
# Dataset Card for "pneumonia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yogesh0502/cuad_v1 | ---
license: cc-by-4.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_were_was | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 84888
num_examples: 427
- name: test
num_bytes: 731120
num_examples: 3927
- name: train
num_bytes: 776350
num_examples: 4014
download_size: 922236
dataset_size: 1592358
---
# Dataset Card for "MULTI_VALUE_qqp_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hyuuga_hanabi_naruto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hyuuga_hanabi (NARUTO)
This is the dataset of hyuuga_hanabi (NARUTO), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
|
KETI-AIR/kor_dbpedia_14 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: data_index_by_user
dtype: int32
- name: title
dtype: string
- name: content
dtype: string
- name: label
dtype: int32
splits:
- name: train
num_bytes: 207331112
num_examples: 560000
- name: test
num_bytes: 25970187
num_examples: 70000
download_size: 136871622
dataset_size: 233301299
license: cc-by-sa-3.0
---
# Dataset Card for "kor_dbpedia_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
Lehmann, Jens, Robert Isele, Max Jakob, Anja Jentzsch, Dimitris Kontokostas, Pablo N. Mendes, Sebastian Hellmann et al. "DBpedia–a large-scale, multilingual knowledge base extracted from Wikipedia." Semantic web 6, no. 2 (2015): 167-195.
``` |
ll00292007/lora | ---
license: other
---
|
andersonbcdefg/inpars_generated_query_pairs_cf | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 402932617.342133
num_examples: 373879
download_size: 188457477
dataset_size: 402932617.342133
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_261 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 938582900.0
num_examples: 184325
download_size: 955854329
dataset_size: 938582900.0
---
# Dataset Card for "chunk_261"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bluebomber182/Princess-Merida-From-Brave | ---
license: unknown
---
|
open-llm-leaderboard/details_gagan3012__Multirial | ---
pretty_name: Evaluation run of gagan3012/Multirial
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__Multirial\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6087068516861436,\n\
\ \"acc_stderr\": 0.032980911385021405,\n \"acc_norm\": 0.6135781515215905,\n\
\ \"acc_norm_stderr\": 0.03364558465127436,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n\
\ \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.01434686906022933,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168478\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n\
\ \"acc_stderr\": 0.0048760280379419405,\n \"acc_norm\": 0.7956582354112727,\n\
\ \"acc_norm_stderr\": 0.0040239573344619875\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n\
\ \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n\
\ \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n\
\ \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n\
\ \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.02491524398598785,\n \
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"\
acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597542,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597542\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.01492744710193716,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.01492744710193716\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n\
\ \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855576\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \
\ \"acc_stderr\": 0.013516752972721716\n }\n}\n```"
repo_url: https://huggingface.co/gagan3012/Multirial
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- '**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet'
- config_name: results
data_files:
- split: 2024_01_14T02_38_13.132787
path:
- results_2024-01-14T02-38-13.132787.parquet
- split: latest
path:
- results_2024-01-14T02-38-13.132787.parquet
---
# Dataset Card for Evaluation run of gagan3012/Multirial
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__Multirial",
	"harness_winogrande_5",
	split="latest")
```
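The config names above follow mechanically from the harness task identifiers reported in the results (for example, `harness|truthfulqa:mc|0` is exposed as the `harness_truthfulqa_mc_0` config). A small helper sketching that mapping — `task_to_config` is a hypothetical convenience function, not part of the `datasets` API:

```python
def task_to_config(task: str) -> str:
    """Map a harness task identifier to this dataset's config name.

    e.g. "harness|winogrande|5" -> "harness_winogrande_5"
    """
    # The config names replace the |, :, and - separators with underscores.
    return task.replace("|", "_").replace(":", "_").replace("-", "_")


# Example: build the config name for the TruthfulQA eval, then load it.
config = task_to_config("harness|truthfulqa:mc|0")  # -> "harness_truthfulqa_mc_0"
```
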
## Latest results
These are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6087068516861436,
"acc_stderr": 0.032980911385021405,
"acc_norm": 0.6135781515215905,
"acc_norm_stderr": 0.03364558465127436,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.01434686906022933,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168478
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.0048760280379419405,
"acc_norm": 0.7956582354112727,
"acc_norm_stderr": 0.0040239573344619875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597542,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597542
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193716,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193716
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855576
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721716
}
}
```
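The aggregate metrics above can be read programmatically once the linked results JSON is downloaded. A minimal sketch, using a trimmed copy of the `"all"` block rather than the real file (in practice the file would be read with `json.load()`; the field names mirror the snippet above):

```python
# Trimmed copy of the aggregate "all" block from the results JSON above;
# with the real file, replace this literal with json.load(open(path)).
results = {
    "all": {
        "acc": 0.6087068516861436,
        "acc_norm": 0.6135781515215905,
    },
    "harness|gsm8k|5": {
        "acc": 0.4040940106141016,
    },
}

# Headline accuracy, reported as a percentage.
headline_acc = results["all"]["acc"]
print(f"average acc: {headline_acc:.2%}")  # average acc: 60.87%
```

The per-task blocks (e.g. `harness|gsm8k|5`) follow the same key structure, so the same indexing pattern applies to any of them.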
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
keehuachin/cleaner | ---
dataset_info:
features:
- name: Input
dtype: string
- name: cleaner_text
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: labels
sequence: int64
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 136223322.9104233
num_examples: 8296
- name: test
num_bytes: 34072251.08957671
num_examples: 2075
download_size: 39442189
dataset_size: 170295574.0
---
# Dataset Card for "cleaner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jurnu/f | ---
license: bigscience-openrail-m
---
<a href="https://twitter.com/">twitter</a>
<a rel="alternate" href="https://twitter.com/">twitter</a>
<a rel="next" href="https://twitter.com/">twitter</a>
<a rel="prev" href="https://twitter.com/">twitter</a>
<a rel="amphtml" href="https://twitter.com/">twitter</a>
<a rel="follow" href="https://twitter.com/">twitter</a>
<a rel="author" href="https://twitter.com/">twitter</a>
<a rel="bookmark" href="https://twitter.com/">twitter</a>
<a rel="external" href="https://twitter.com/">twitter</a>
<a rel="license" href="https://twitter.com/">twitter</a>
<a rel="noreferrer" href="https://twitter.com/">twitter</a>
<a rel="noopener" href="https://twitter.com/">twitter</a>
<a rel="search" href="https://twitter.com/">twitter</a>
<a rel="tag" href="https://twitter.com/">twitter</a>
<a rel="sponsored" href="https://twitter.com/">twitter</a>
<a rel="ugc" href="https://twitter.com/">twitter</a>
<a rel="dofollow" href="https://twitter.com/">twitter</a>
[url]https://twitter.com/[/url]
[url=https://twitter.com/]twitter[/url]
|
AIRI-NLP/quality_counter_new_1536 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 550553150
num_examples: 20000
- name: validation
num_bytes: 226698164
num_examples: 8000
- name: test
num_bytes: 56238416
num_examples: 2300
download_size: 26414757
dataset_size: 833489730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Coooori/instruction_data_dev_hf | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1205865
num_examples: 1087
download_size: 234027
dataset_size: 1205865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruction_data_dev_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jaswir/tm-data | ---
license: apache-2.0
---
|
ravithejads/samvaad-hi-filtered | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 87235012
num_examples: 33371
download_size: 29394921
dataset_size: 87235012
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
han2lin/squad | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 69824466.34546056
num_examples: 77087
- name: valid
num_bytes: 9521641.654539436
num_examples: 10512
- name: test
num_bytes: 10472984
num_examples: 10570
download_size: 52413878
dataset_size: 89819092.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
nayohan/multi_session_chat | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: dataset
dtype: string
- name: dialoug_id
dtype: int64
- name: session_id
dtype: int64
- name: persona1
sequence: string
- name: persona2
sequence: string
- name: dialogue
sequence: string
- name: speaker
sequence: string
splits:
- name: train
num_bytes: 30863868
num_examples: 17940
- name: validation
num_bytes: 6329337
num_examples: 3000
- name: test
num_bytes: 5867348
num_examples: 2505
download_size: 0
dataset_size: 43060553
---
# Dataset Card for "multi_session_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_same_length_find_passage_train400_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 289948
num_examples: 840
- name: validation
num_bytes: 15536
num_examples: 40
download_size: 132781
dataset_size: 305484
---
# Dataset Card for "random_letter_same_length_find_passage_train400_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsstein/50-baseline-dataset-llama | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
- name: raw_summary
dtype: string
splits:
- name: train
num_bytes: 129500673
num_examples: 15326
- name: test
num_bytes: 4638887
num_examples: 576
- name: validation
num_bytes: 4921772
num_examples: 576
download_size: 85137871
dataset_size: 139061332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
BennoKrojer/ImageCoDe | ---
license: afl-3.0
---
# Dataset Card for ImageCoDe
To get started quickly, load descriptions via:
```python
from datasets import load_dataset
examples = load_dataset('BennoKrojer/ImageCoDe')
```
And download `image_sets.zip` for all image sets (each directory consisting of 10 images).
## Dataset Description
- **Homepage & Leaderboard:** https://mcgill-nlp.github.io/imagecode/
- **Repository:** https://github.com/McGill-NLP/imagecode
- **Paper:** https://arxiv.org/abs/2203.15867
- **Point of Contact:** benno DOT krojer ÄT gmail DOT com
### Dataset Summary
We introduce ImageCoDe, a vision-and-language benchmark that requires contextual language understanding in the form of pragmatics, temporality, long descriptions and visual nuances. The task: given a detailed description, retrieve the target image among 10 minimally contrastive images. ImageCoDe contains 21K descriptions and 94K images. The images are primarily frames extracted from video datasets.
## Dataset Structure
### Data Instances
An instance contains a description, the corresponding image set name, and the target index:
```json
{"image_set": "video-storytelling-videowedding_de8dLXvgV-I-shot6_0",
"image_index": "8",
"description": "The flowers the woman in the teal strapless dress is carrying are completely obscured by the man in the black shirt's head. "}
```
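The retrieval setup can be sketched from this instance. A minimal example of consuming it (the `image_{i}` placeholder names are hypothetical stand-ins; the actual candidate images come from `image_sets.zip`):

```python
# One retrieval instance, with fields as shown in the example above.
instance = {
    "image_set": "video-storytelling-videowedding_de8dLXvgV-I-shot6_0",
    "image_index": "8",
    "description": "The flowers the woman in the teal strapless dress is "
                   "carrying are completely obscured by the man in the "
                   "black shirt's head. ",
}

# "image_index" is stored as a string; convert it before using it to pick
# the target out of the 10 minimally contrastive candidates.
target = int(instance["image_index"])
candidates = [f"image_{i}" for i in range(10)]  # hypothetical stand-ins
print(candidates[target])  # image_8
```

A model would score `instance["description"]` against all 10 candidates and is evaluated on whether its top-ranked image matches `target`.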
### Data Splits
| Dataset Split | Number of Descriptions in Split |
| ------------- |----------------------------- |
| Train | 16,594 |
| Validation | 2,302 |
| Test | 2,306 |
## Dataset Creation
### Curation Rationale
The main goal of ImageCoDe is to highlight weaknesses of recent Vision-and-Language models regarding complex language and fine-grained visual representations. In addition, we found that the dataset offers plenty of pragmatic examples and is therefore suitable for studying pragmatics. |
the-coorporation/the_squad_qg | ---
license: wtfpl
dataset_info:
- config_name: v2
features:
- name: context
dtype: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 20328952
num_examples: 18877
- name: validation
num_bytes: 1419411
num_examples: 1204
download_size: 24163282
dataset_size: 21748363
- config_name: v1
features:
- name: context
dtype: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 20391081
num_examples: 18891
- name: validation
num_bytes: 2389185
num_examples: 2067
download_size: 25308169
dataset_size: 22780266
language:
- en
pretty_name: The SQuAD QG Dataset
---
# The SQuAD QG Dataset
## Description
[Stanford Question Answering Dataset (SQuAD)](https://rajpurkar.github.io/SQuAD-explorer/) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
This modified version is aimed at question generation:
each entry contains a context and its related questions concatenated into a single string.
`The SQuAD QG` unites SQuAD 1.1 and 2.0 in two subsets (`v1` and `v2`), each containing a `train` and a `validation` split.
## Dataset Structure
### Data Instances
An example entry looks as follows:
```python
{
    "context": "This is a test context",
    "questions": ["Is this a test?", "Is this a test context?"]
}
```
### Data Fields
The dataset has the following fields:
* context: a string feature
* questions: a string feature
**NB:** The data fields are the same among all splits.
### Data Splits
| name | train | validation |
|------|-------|------------|
| v1 | 18891 | 2067 |
| v2 | 18877 | 1204 |
|
krishanusinha20/marketing_emails | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 20941
num_examples: 10
download_size: 26509
dataset_size: 20941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing_emails"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amitness/logits-arabic | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 16680302900
num_examples: 1059523
download_size: 5639945948
dataset_size: 16680302900
---
# Dataset Card for "logits-arabic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
caoc12581/jax | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 271658381.0
num_examples: 2
download_size: 113444578
dataset_size: 271658381.0
---
# Dataset Card for "whisper-jax-test-files"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BettercallSaulGM/crc_image_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 130747958.0
num_examples: 1000
download_size: 0
dataset_size: 130747958.0
---
# Dataset Card for "crc_image_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katielink/race-based_medicine_questions | ---
license: cc-by-4.0
tags:
- medical
---
Questions used in the paper, Omiye et al (2023) ["Large language models propagate race-based medicine"](https://www.nature.com/articles/s41746-023-00939-z) |
Gladiaio/Instruct-Summary | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
task_categories:
- summarization
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for "Instruct-Summary"
This dataset is a combination of [kmfoda/booksum](https://huggingface.co/datasets/kmfoda/booksum), [samsum](https://huggingface.co/datasets/samsum/tree/main/data), [mosaicml/dolly_hhrlhf](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) and [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned). |
polinaeterna/list | ---
dataset_info:
features:
- name: list
sequence: int64
splits:
- name: train
num_bytes: 69
num_examples: 5
download_size: 1061
dataset_size: 69
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
moneim/uk-careers-prompts | ---
license: mit
---
|
amitness/logits-mt-512 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 195656401.36799684
num_examples: 10756
- name: test
num_bytes: 34543650.63200316
num_examples: 1899
download_size: 84854727
dataset_size: 230200052.0
---
# Dataset Card for "logits-mt-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1 | ---
pretty_name: Evaluation run of tenyx/TenyxChat-8x7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tenyx/TenyxChat-8x7B-v1](https://huggingface.co/tenyx/TenyxChat-8x7B-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T15:44:33.051558](https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1/blob/main/results_2024-01-20T15-44-33.051558.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7101891541491676,\n\
\ \"acc_stderr\": 0.030258624657643698,\n \"acc_norm\": 0.7137715225758183,\n\
\ \"acc_norm_stderr\": 0.030842789389844256,\n \"mc1\": 0.5018359853121175,\n\
\ \"mc1_stderr\": 0.01750338304687705,\n \"mc2\": 0.6541929389144224,\n\
\ \"mc2_stderr\": 0.015163572290637445\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537298,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6890061740689106,\n\
\ \"acc_stderr\": 0.004619542392006391,\n \"acc_norm\": 0.8776140211113324,\n\
\ \"acc_norm_stderr\": 0.003270612753613399\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838994,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838994\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8483870967741935,\n \"acc_stderr\": 0.02040261665441676,\n \"\
acc_norm\": 0.8483870967741935,\n \"acc_norm_stderr\": 0.02040261665441676\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6157635467980296,\n \"acc_stderr\": 0.03422398565657551,\n \"\
acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.03422398565657551\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.01349265975129515,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.01349265975129515\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205145,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205145\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017016,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017016\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281224,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281224\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.879948914431673,\n\
\ \"acc_stderr\": 0.011622736692041283,\n \"acc_norm\": 0.879948914431673,\n\
\ \"acc_norm_stderr\": 0.011622736692041283\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108395,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108395\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n\
\ \"acc_stderr\": 0.023093140398374224,\n \"acc_norm\": 0.7909967845659164,\n\
\ \"acc_norm_stderr\": 0.023093140398374224\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5443285528031291,\n\
\ \"acc_stderr\": 0.012719949543032228,\n \"acc_norm\": 0.5443285528031291,\n\
\ \"acc_norm_stderr\": 0.012719949543032228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711274,\n\
\ \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711274\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5018359853121175,\n\
\ \"mc1_stderr\": 0.01750338304687705,\n \"mc2\": 0.6541929389144224,\n\
\ \"mc2_stderr\": 0.015163572290637445\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435093\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.013428382481274249\n }\n}\n```"
repo_url: https://huggingface.co/tenyx/TenyxChat-8x7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|arc:challenge|25_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|gsm8k|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hellaswag|10_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T15-44-33.051558.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- '**/details_harness|winogrande|5_2024-01-20T15-44-33.051558.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T15-44-33.051558.parquet'
- config_name: results
data_files:
- split: 2024_01_20T15_44_33.051558
path:
- results_2024-01-20T15-44-33.051558.parquet
- split: latest
path:
- results_2024-01-20T15-44-33.051558.parquet
---
# Dataset Card for Evaluation run of tenyx/TenyxChat-8x7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tenyx/TenyxChat-8x7B-v1](https://huggingface.co/tenyx/TenyxChat-8x7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-20T15:44:33.051558](https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1/blob/main/results_2024-01-20T15-44-33.051558.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7101891541491676,
"acc_stderr": 0.030258624657643698,
"acc_norm": 0.7137715225758183,
"acc_norm_stderr": 0.030842789389844256,
"mc1": 0.5018359853121175,
"mc1_stderr": 0.01750338304687705,
"mc2": 0.6541929389144224,
"mc2_stderr": 0.015163572290637445
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537298,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6890061740689106,
"acc_stderr": 0.004619542392006391,
"acc_norm": 0.8776140211113324,
"acc_norm_stderr": 0.003270612753613399
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838994,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838994
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8483870967741935,
"acc_stderr": 0.02040261665441676,
"acc_norm": 0.8483870967741935,
"acc_norm_stderr": 0.02040261665441676
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.01349265975129515,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.01349265975129515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205145,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205145
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017016,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017016
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281224,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281224
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.879948914431673,
"acc_stderr": 0.011622736692041283,
"acc_norm": 0.879948914431673,
"acc_norm_stderr": 0.011622736692041283
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108395,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108395
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.023093140398374224,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.023093140398374224
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5443285528031291,
"acc_stderr": 0.012719949543032228,
"acc_norm": 0.5443285528031291,
"acc_norm_stderr": 0.012719949543032228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.025035845227711274,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.025035845227711274
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5018359853121175,
"mc1_stderr": 0.01750338304687705,
"mc2": 0.6541929389144224,
"mc2_stderr": 0.015163572290637445
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435093
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274249
}
}
```
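As a rough sketch, an aggregate accuracy like the one in the `"all"` block can be recomputed from the per-task entries. The snippet below uses a hand-copied, abbreviated subset of the JSON above (three MMLU subtasks only); in practice you would `json.load()` the full results file, and note that the leaderboard's own aggregation may differ (it averages over all 57 MMLU subtasks), so this unweighted mean is purely illustrative:

```python
# Illustrative only: recompute an aggregate accuracy from per-task results.
# The dict below is an abbreviated subset of the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6814814814814815},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7894736842105263},
}

# Keep only MMLU (hendrycksTest) tasks and take the unweighted mean accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_avg = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average over {len(mmlu_tasks)} tasks: {mmlu_avg:.4f}")
```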
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nblinh63/twitter_dataset_1712696764 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 80868
num_examples: 200
download_size: 38701
dataset_size: 80868
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
livinNector/ta-news-corp | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: tamil_murasu
num_bytes: 499641675
num_examples: 263669
- name: dinamalar
num_bytes: 5225297151
num_examples: 4125162
download_size: 1955475887
dataset_size: 5724938826
---
# Dataset Card for "ta-news-corp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/2M_Ceramic_Vasa_SDXL_Refiner_Prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1014335233
num_examples: 2000000
download_size: 95271776
dataset_size: 1014335233
---
# Dataset Card for "2M_Ceramic_Vasa_SDXL_Refiner_Prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jrahn/yolochess_lichess-elite_2211 | ---
dataset_info:
features:
- name: fen
dtype: string
- name: move
dtype: string
- name: result
dtype: string
- name: eco
dtype: string
splits:
- name: train
num_bytes: 1794337922
num_examples: 22116598
download_size: 1044871571
dataset_size: 1794337922
task_categories:
- text-classification
- reinforcement-learning
license: cc
tags:
- chess
size_categories:
- 10M<n<100M
---
# Dataset Card for "yolochess_lichess-elite_2211"
Source: https://database.nikonoel.fr/ - filtered from https://database.lichess.org for November 2022
Features:
- fen = Chess board position in [FEN](https://en.wikipedia.org/wiki/Forsyth%E2%80%93Edwards_Notation) format
- move = Move played by a strong human player in this position
- result = Final result of the match
- eco = [ECO](https://en.wikipedia.org/wiki/Encyclopaedia_of_Chess_Openings)-code of the Opening played
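As a minimal, dependency-free sketch (not part of any official tooling for this dataset), the piece-placement field of a `fen` value can be expanded into an 8x8 board as below; for real use, a dedicated library such as `python-chess` is the safer choice:

```python
def fen_board(fen: str) -> list[list[str]]:
    """Expand the piece-placement field of a FEN string into an 8x8 grid.

    Digits in FEN encode runs of empty squares; '.' marks an empty square here.
    """
    placement = fen.split()[0]          # first FEN field: piece placement
    board = []
    for rank in placement.split("/"):   # ranks 8 -> 1, top to bottom
        row = []
        for ch in rank:
            if ch.isdigit():
                row.extend("." * int(ch))  # expand run of empty squares
            else:
                row.append(ch)             # piece letter, e.g. 'K' or 'p'
        board.append(row)
    return board

# The starting position, as one possible value of the `fen` column:
start = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
board = fen_board(start)
```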
Samples: 22.1 million |
llFOZll/Debt_sellement_Prosolvo_fine_tunning | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- finance
pretty_name: Prosolvo_debt_settlement
size_categories:
- n<1K
--- |
mattymchen/natural-instruction-399 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 312100
num_examples: 2899
download_size: 223665
dataset_size: 312100
---
# Dataset Card for "natural-instruction-399"
## Dataset Description
In this task you are given a tweet. You must judge whether the author of the tweet is sad or not. Label the instances as "Sad" or "Not sad" based on your judgment. You can get help from hashtags and emojis, but you should not judge based on them alone, and should pay attention to the tweet's text as well.
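The `label` column is stored as an integer. A small sketch of mapping it back to the task's string labels, assuming the ordering below (this ordering is an assumption; verify it against the data before relying on it):

```python
# Assumed label ordering; confirm against the dataset before use.
LABEL_NAMES = ["Not sad", "Sad"]

def to_label_name(label: int) -> str:
    """Map an integer label from the `label` column to its string name."""
    return LABEL_NAMES[label]
```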
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hervezossou/africanvoice | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: audio_id
dtype: string
- name: transcription
dtype: string
- name: normalized_text
dtype: string
splits:
- name: train
num_bytes: 40041320.0
num_examples: 542
download_size: 38275595
dataset_size: 40041320.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dyxohjl666/ACL_CocoScisum | ---
configs:
- config_name: default
data_files:
- split: noctrl
path: "noctrl.csv"
- split: 2023_len50
path: "acl_results_len100_2023.csv"
--- |
jspr/symbolic-jazz-standards | ---
dataset_info:
features:
- name: instrument_type
dtype: string
- name: remi.tokens
sequence: string
- name: remi.ids
sequence: int64
- name: midilike.tokens
sequence: string
- name: midilike.ids
sequence: int64
- name: tsd.tokens
sequence: string
- name: tsd.ids
sequence: int64
- name: song_title
dtype: string
splits:
- name: train
num_bytes: 125868320
num_examples: 709
download_size: 10604547
dataset_size: 125868320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- audio-to-audio
tags:
- music
---
# Symbolic Jazz Standards
A symbolic-domain music dataset of jazz standards, transcribed stem by stem from the audio domain into the symbolic domain. The dataset contains the equivalent of 10,000 minutes of audio from ~200 well-known public-domain songs.
## Methodology
To create this dataset, recordings of public-domain jazz standards were downloaded and separated into their component stems using the venerable [Demucs](https://github.com/facebookresearch/demucs) source separation library in 4-stem mode. The resulting stems are:
```
vocals
bass
drums
other
```
The resulting audio-domain stems are then fed through a proprietary polyphonic music transcription model to obtain the stems' corresponding symbolic-domain representations — that is, the notes that are being played or sung in the music.
The transcriptions are polyphonic for 'other' stems, percussive for drum stems, and monophonic for vocal and bass stems.
Finally, the raw symbolic-domain data is tokenized via the following strategies:
- [REMI](https://miditok.readthedocs.io/en/v3.0.1/tokenizations.html#remi)
- [MIDI-like](https://miditok.readthedocs.io/en/v3.0.1/tokenizations.html#midi-like)
- [TSD](https://miditok.readthedocs.io/en/v3.0.1/tokenizations.html#tsd)
This is performed with the excellent [MidiTok](https://github.com/Natooz/MidiTok) library.
## Dataset Structure
The dataset has the following columns:
- `song_title`: the title of the transcribed song, possibly including information about the performing artist.
- `instrument_type`: one of `vocals`, `bass`, `drums`, or `other`
- `remi.tokens`: a list of strings containing human-readable music tokens, in REMI format
- `remi.ids`: a list of integers representing machine-readable music tokens, in REMI format
- `midilike.tokens`: a list of strings containing human-readable music tokens, in MIDI-like format
- `midilike.ids`: a list of integers representing machine-readable music tokens, in MIDI-like format
- `tsd.tokens`: a list of strings containing human-readable music tokens, in TSD format
- `tsd.ids`: a list of integers representing machine-readable music tokens, in TSD format
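As an illustrative sketch of how the paired `*.tokens` / `*.ids` columns relate, each id is the token's index in the tokenizer's vocabulary. The toy vocabulary below stands in for a real MidiTok vocabulary (the token names are only plausible examples, not guaranteed to match the dataset's exact vocabulary):

```python
# Toy vocabulary standing in for a MidiTok tokenizer's vocab (illustrative only).
vocab = {
    "Bar_None": 0,
    "Position_0": 1,
    "Pitch_60": 2,
    "Velocity_96": 3,
    "Duration_1.0.8": 4,
}
inv_vocab = {i: t for t, i in vocab.items()}  # id -> token

tokens = ["Bar_None", "Position_0", "Pitch_60", "Velocity_96", "Duration_1.0.8"]
ids = [vocab[t] for t in tokens]          # tokens -> ids, as in `remi.ids`
roundtrip = [inv_vocab[i] for i in ids]   # ids -> tokens, as in `remi.tokens`
```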
## Uses
This dataset is intended for fine-tuning or pre-training generative symbolic-domain music models, or for jointly conditioning audio-domain music models on the underlying symbolic-domain data.
## Contact
For more info on this dataset, or to inquire about building similar datasets for your audio-domain data, please reach out to hello@atonaldata.com, or visit https://atonaldata.com
## Legal Disclaimer
<details>
The user of this dataset ("User") assumes all responsibility and risk for the use of this dataset. The User agrees to indemnify, defend, and hold harmless Atonal Data, its affiliates, officers, directors, employees, consultants, agents, and representatives from any and all third party claims, losses, liability, damages, and/or costs (including reasonable attorney fees and costs) arising from the User's access to or use of the dataset, violation of this Agreement, or infringement of any intellectual property or other right of any person or entity.
Atonal Data provides this dataset on an "as is" basis without any express or implied warranties, including, but not limited to, warranties of merchantability or fitness for a particular purpose. In no event shall Atonal Data be liable for any direct, indirect, incidental, punitive, or consequential damages of any kind whatsoever with respect to the dataset.
This dataset is compiled under the doctrine of fair use, and it is the User's responsibility to ensure that their use of the dataset does not infringe upon any copyright laws. All songs contained in this dataset are believed to be in the public domain. However, Atonal Data does not warrant or represent that use of the dataset will not infringe rights of third parties. The User is responsible for ensuring that their use of this dataset complies with all applicable laws and regulations.
</details>
## Citation
```
@misc{symbolicjazzstandards,
title={Symbolic Jazz Standards},
author={Atonal Data},
year={2024},
}
``` |
nastyboget/gan_hkr | ---
license: mit
task_categories:
- image-to-text
language:
- ru
size_categories:
- 100K<n<1M
---
Dataset generated from the HKR train set using ScrabbleGAN
======================================================
Number of images: 300000
Sources:
* [HKR dataset](https://github.com/abdoelsayed2016/HKR_Dataset)
* [ScrabbleGAN code](https://github.com/ai-forever/ScrabbleGAN) |
BangumiBase/hinamatsuri | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Hinamatsuri
This is the image base of bangumi Hinamatsuri. We detected 23 characters and 1820 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean and may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 107 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 93 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 11 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 342 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 216 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 40 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 27 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 90 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 39 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 64 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 30 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 284 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 51 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 14 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 217 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 28 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 25 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 9 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 30 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 8 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 43 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
hojzas/proj4-match_permutations_substrings-lab1 | ---
license: apache-2.0
---
|
KeshavRa/About_YSA_Database | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 11938
num_examples: 57
download_size: 7711
dataset_size: 11938
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713097341 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11799
num_examples: 30
download_size: 13758
dataset_size: 11799
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713097341"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/Boardgame-QA | ---
license: cc-by-4.0
dataset_info:
features:
- name: proof
dtype: string
- name: example
dtype: string
- name: label
dtype: string
- name: rules
dtype: string
- name: preferences
dtype: string
- name: theory
dtype: string
- name: goal
dtype: string
- name: facts
dtype: string
- name: config
dtype: string
splits:
- name: test
num_bytes: 54209160
num_examples: 15000
- name: train
num_bytes: 55055604
num_examples: 15000
- name: valid
num_bytes: 27317650
num_examples: 7500
download_size: 34032485
dataset_size: 136582414
---
https://arxiv.org/pdf/2306.07934.pdf |
open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3 | ---
pretty_name: Evaluation run of NeverSleep/Noromaid-13b-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeverSleep/Noromaid-13b-v0.3](https://huggingface.co/NeverSleep/Noromaid-13b-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T08:43:54.536488](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3/blob/main/results_2024-01-08T08-43-54.536488.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5677987077394565,\n\
\ \"acc_stderr\": 0.033653954046911065,\n \"acc_norm\": 0.5743169734927792,\n\
\ \"acc_norm_stderr\": 0.034368230343916395,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5073138068542993,\n\
\ \"mc2_stderr\": 0.015726117257006858\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.01433223630679015,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6479784903405696,\n\
\ \"acc_stderr\": 0.004766245539606633,\n \"acc_norm\": 0.8441545508862777,\n\
\ \"acc_norm_stderr\": 0.0036196748640350256\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920935,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920935\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472434,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472434\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316455,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316455\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.02525448542479961,\n \
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.02525448542479961\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243739,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243739\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542106,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542106\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n\
\ \"acc_stderr\": 0.016666783616525772,\n \"acc_norm\": 0.4592178770949721,\n\
\ \"acc_norm_stderr\": 0.016666783616525772\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215355,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215355\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630988,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630988\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.020017629214213094,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.020017629214213094\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5073138068542993,\n\
\ \"mc2_stderr\": 0.015726117257006858\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \
\ \"acc_stderr\": 0.011600249020595825\n }\n}\n```"
repo_url: https://huggingface.co/NeverSleep/Noromaid-13b-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|arc:challenge|25_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|arc:challenge|25_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|gsm8k|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|gsm8k|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hellaswag|10_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hellaswag|10_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-16-01.123734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T08-43-54.536488.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- '**/details_harness|winogrande|5_2024-01-07T22-16-01.123734.parquet'
- split: 2024_01_08T08_43_54.536488
path:
- '**/details_harness|winogrande|5_2024-01-08T08-43-54.536488.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T08-43-54.536488.parquet'
- config_name: results
data_files:
- split: 2024_01_07T22_16_01.123734
path:
- results_2024-01-07T22-16-01.123734.parquet
- split: 2024_01_08T08_43_54.536488
path:
- results_2024-01-08T08-43-54.536488.parquet
- split: latest
path:
- results_2024-01-08T08-43-54.536488.parquet
---
# Dataset Card for Evaluation run of NeverSleep/Noromaid-13b-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-13b-v0.3](https://huggingface.co/NeverSleep/Noromaid-13b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-08T08:43:54.536488](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-13b-v0.3/blob/main/results_2024-01-08T08-43-54.536488.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5677987077394565,
"acc_stderr": 0.033653954046911065,
"acc_norm": 0.5743169734927792,
"acc_norm_stderr": 0.034368230343916395,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5073138068542993,
"mc2_stderr": 0.015726117257006858
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.01433223630679015,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6479784903405696,
"acc_stderr": 0.004766245539606633,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.0036196748640350256
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920935,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472434,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472434
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316455,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316455
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.02525448542479961,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.02525448542479961
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243739,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243739
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542106,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542106
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895817,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525772,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630988,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630988
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.020017629214213094,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.020017629214213094
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5073138068542993,
"mc2_stderr": 0.015726117257006858
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595825
}
}
```
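The per-task `acc` values above can be combined into a macro average with a few lines of Python. This is a minimal sketch using only three of the task scores (copied from the JSON above); extend the dict with the remaining tasks for the full aggregate:

```python
# Macro-average accuracy over a subset of the per-task scores above.
# Values are copied verbatim from the latest-results JSON.
scores = {
    "harness|hendrycksTest-virology|5": 0.463855421686747,
    "harness|hendrycksTest-world_religions|5": 0.7777777777777778,
    "harness|hendrycksTest-us_foreign_policy|5": 0.82,
}

macro_acc = sum(scores.values()) / len(scores)
print(round(macro_acc, 4))  # → 0.6872
```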
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jtatman/orca_minis_uncensored_squad_format | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 145821574
num_examples: 104179
download_size: 60327229
dataset_size: 145821574
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- question-answering
language:
- en
tags:
- squad
- orca
- subset
- refactor
- uncensored
- qa
- questions
pretty_name: uncensored_orca_subset_squad
size_categories:
- 10K<n<100K
---
# Dataset Card for "orca_minis_uncensored_squad_format"
This dataset is part of a continuing series that reformats existing data from unrelated datasets for use with question-answering models.
Alternatively, it provides a common format that can be converted to other formats fairly easily using available scripts and utilities.
### This is a work in progress and currently changes every few days. Please refrain from using it for anything serious unless a cautionary example is needed.
[Original Dataset](https://huggingface.co/datasets/psmathur/orca_minis_uncensored_dataset)
|
hounsouthohin/bears-fastai-2021 | ---
license: apache-2.0
---
|
alayaran/bodo-english-prompt-translation | ---
license: mit
---
|
MatsuoDochiai/Took1 | ---
license: openrail
---
|
gdfhjjytr/embeddings_tutorial_dataset | ---
license: mit
---
|
Sachin7/HomeTeamPrediction | ---
dataset_info:
features:
- name: date
dtype: string
- name: home_team
dtype: string
- name: away_team
dtype: string
- name: tournament
dtype: string
- name: city
dtype: string
- name: country
dtype: string
- name: neutral
dtype: bool
- name: result
dtype: int64
splits:
- name: train
num_bytes: 2665229.7
num_examples: 29162
- name: test
num_bytes: 1142241.3
num_examples: 12498
download_size: 1096165
dataset_size: 3807471.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ahishamm/isic_vit_db_cropped | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign
'1': keratosis
'2': melanoma
splits:
- name: train
num_bytes: 27772325.0
num_examples: 278
- name: test
num_bytes: 4737058.0
num_examples: 65
download_size: 32511407
dataset_size: 32509383.0
---
# Dataset Card for "isic_vit_db_cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dyhsup/CPR | ---
license: unknown
---
|
zambezivoice/zambezivoice_bem_text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 937260
num_examples: 14121
download_size: 629604
dataset_size: 937260
---
# Dataset Card for "zambezivoice_bem_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ashish08/jacob-soni | ---
license: apache-2.0
language:
- en
pretty_name: My Dog - Jacob Soni
size_categories:
- n<1K
source_datasets:
- original
tags:
- 'images '
- pet
- dog
- german-shepherd
- dreambooth-hackathon
---
# Dataset Card for jacob-soni
## Dataset Description
The dataset contains images of my pet Jacob, currently 7 years old.
### Dataset Curators
The data has been originally collected by Ashish Soni and his family.
### Licensing Information
The jacob-soni dataset version 1.0.0 is released under the Apache-2.0 License. |
sid-th26/prelims_question | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Option_A
dtype: string
- name: Option_B
dtype: string
- name: Option_C
dtype: string
- name: Option_D
dtype: string
- name: Explaination
dtype: string
- name: Answer
dtype: string
- name: Topic
dtype: string
- name: Subject
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 18912226
num_examples: 9999
download_size: 9002864
dataset_size: 18912226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
imodels/diabetes-readmission | ---
annotations_creators: []
language: []
language_creators: []
license: []
multilinguality: []
pretty_name: diabetes-readmission
size_categories:
- 100K<n<1M
source_datasets: []
tags:
- interpretability
- fairness
- medicine
task_categories:
- tabular-classification
task_ids: []
---
Port of the diabetes-readmission dataset from UCI (link [here](https://archive.ics.uci.edu/ml/datasets/diabetes+130-us+hospitals+for+years+1999-2008)). See details there and use carefully.
Basic preprocessing done by the [imodels team](https://github.com/csinva/imodels) in [this notebook](https://github.com/csinva/imodels-data/blob/master/notebooks_fetch_data/00_get_datasets_custom.ipynb).
The target is the binary outcome `readmitted`.
### Sample usage
Load the data:
```python
import pandas as pd
from datasets import load_dataset

dataset = load_dataset("imodels/diabetes-readmission")
df = pd.DataFrame(dataset['train'])
X = df.drop(columns=['readmitted'])
y = df['readmitted'].values
```
Fit a model:
```python
import imodels
import numpy as np
m = imodels.FIGSClassifier(max_rules=5)
m.fit(X, y)
print(m)
```
Evaluate:
```python
df_test = pd.DataFrame(dataset['test'])
X_test = df_test.drop(columns=['readmitted'])
y_test = df_test['readmitted'].values
print('accuracy', np.mean(m.predict(X_test) == y_test))
``` |
mlfoundations/datacomp_pools | ---
license: cc-by-4.0
---
## DataComp Pools
This repository contains metadata files for DataComp. For details on how to use the metadata, please visit [our website](https://www.datacomp.ai/) and our [github repository](https://github.com/mlfoundations/datacomp).
We distribute the image url-text samples and metadata under a standard Creative Common CC-BY-4.0 license. The individual images are under their own copyrights.
## Terms and Conditions
We have terms of service that are similar to those adopted by HuggingFace (https://huggingface.co/terms-of-service), which covers their dataset library. Specifically, any content you download, access or use from our index, is at your own risk and subject to the terms of service or copyright limitations accompanying such content. The image url-text index, which is a research artifact, is provided as is. By using said index, you assume all risks, including but not limited to, liabilities related to image downloading and storage.
|
hlillemark/c4_t5_100 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 534000
num_examples: 100
download_size: 257151
dataset_size: 534000
---
# Dataset Card for "c4_t5_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/dummy_data_clean | ---
dataset_info:
features:
- name: content
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: path
dtype: string
- name: annotation_id
dtype: string
- name: pii
dtype: string
- name: pii_modified
dtype: string
splits:
- name: train
num_bytes: 3808098.717948718
num_examples: 400
download_size: 1311649
dataset_size: 3808098.717948718
---
# Dataset Card for "dummy_data_clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
danielfrr/peccin | ---
license: openrail
---
|
DavidLanz/yentinglin-traditional_mandarin_instructions | ---
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
- text-generation
language:
- zh
- en
size_categories:
- 10K<n<100K
---
Language Models for Taiwanese Culture training dataset.
## Citation
Please cite the repo if you use the data or code in this repo.
```
@inproceedings{lin-chen-2023-llm,
title = "{LLM}-Eval: Unified Multi-Dimensional Automatic Evaluation for Open-Domain Conversations with Large Language Models",
author = "Lin, Yen-Ting and Chen, Yun-Nung",
booktitle = "Proceedings of the 5th Workshop on NLP for Conversational AI (NLP4ConvAI 2023)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.nlp4convai-1.5",
pages = "47--58"
}
@misc{taiwanllama,
author={Lin, Yen-Ting and Chen, Yun-Nung},
title={Taiwanese-Aligned Language Models based on Meta-Llama2},
year={2023},
url={https://github.com/adamlin120/Taiwan-LLaMa},
note={Code and models available at https://github.com/adamlin120/Taiwan-LLaMa},
}
``` |
CyberHarem/nio_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nio/ニオ (Granblue Fantasy)
This is the dataset of nio/ニオ (Granblue Fantasy), containing 152 images and their tags.
The core tags of this character are `hair_over_one_eye, purple_hair, pointy_ears, long_hair, hair_ornament, ponytail, purple_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 176.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 152 | 109.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 354 | 236.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 152 | 162.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 354 | 321.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nio_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, harvin, solo, looking_at_viewer, navel_cutout, blush, cape, bare_shoulders, black_thighhighs, dress, breasts, simple_background, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, blush, hair_flower, harvin, obi, solo, looking_at_viewer, paper_fan, wide_sleeves, yukata, smile, blue_kimono, holding_fan, long_sleeves, parted_lips, small_breasts |
| 2 | 10 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, official_alternate_costume, open_mouth, paw_gloves, bangs, twintails, fang, pantyhose, very_long_hair, fur_trim, jack-o'-lantern, lion_tail, bow, claw_pose, halloween_costume, hood, orange_dress, red_necktie, braid, fake_animal_ears, harvin, sleeveless_dress, star_(symbol), white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | harvin | solo | looking_at_viewer | navel_cutout | blush | cape | bare_shoulders | black_thighhighs | dress | breasts | simple_background | white_background | hair_flower | obi | paper_fan | wide_sleeves | yukata | smile | blue_kimono | holding_fan | long_sleeves | parted_lips | small_breasts | official_alternate_costume | open_mouth | paw_gloves | bangs | twintails | fang | pantyhose | very_long_hair | fur_trim | jack-o'-lantern | lion_tail | bow | claw_pose | halloween_costume | hood | orange_dress | red_necktie | braid | fake_animal_ears | sleeveless_dress | star_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------------------|:---------------|:--------|:-------|:-----------------|:-------------------|:--------|:----------|:--------------------|:-------------------|:--------------|:------|:------------|:---------------|:---------|:--------|:--------------|:--------------|:---------------|:--------------|:----------------|:-----------------------------|:-------------|:-------------|:--------|:------------|:-------|:------------|:-----------------|:-----------|:------------------|:------------|:------|:------------|:--------------------|:-------|:---------------|:--------------|:--------|:-------------------|:-------------------|:----------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
lowem1/ocr_bert-training-2err | ---
dataset_info:
features:
- name: truth
dtype: string
- name: aug
dtype: string
- name: aug_type
dtype: string
- name: doc_tag
dtype: string
- name: distance
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 212731
num_examples: 1795
download_size: 31395
dataset_size: 212731
---
# Dataset Card for "ocr_bert-training-2err"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
p1atdev/nobodies | ---
license: cc0-1.0
---
# Nobodies
AI-generated human image dataset.
## Contents
### Face
- [vol1](https://huggingface.co/datasets/p1atdev/nobodies/blob/main/face/vol1.zip): 32 photos of women's faces. Generated with [WD1.5 beta 2](https://huggingface.co/waifu-diffusion/wd-1-5-beta2).
Sample:
<img class="max-w-lg" src="https://huggingface.co/datasets/p1atdev/nobodies/resolve/main/samples/face/vol1.jpg" />
### Portrait
- [vol1](https://huggingface.co/datasets/p1atdev/nobodies/blob/main/portrait/vol1.zip): 31 photos of women's portraits. Generated with [WD1.5 beta 2](https://huggingface.co/waifu-diffusion/wd-1-5-beta2) and the [fashion LoCon](https://huggingface.co/p1atdev/lora/blob/main/fashion-test1-e5.safetensors).
Sample:
<img class="max-w-lg" src="https://huggingface.co/datasets/p1atdev/nobodies/resolve/main/samples/portrait/vol1.jpg" />
- [vol2](https://huggingface.co/datasets/p1atdev/nobodies/blob/main/portrait/vol2.zip): 165 photos of women's portraits. Generated with [WD1.5 beta 2](https://huggingface.co/waifu-diffusion/wd-1-5-beta2) and the [fashion LoCon](https://huggingface.co/p1atdev/lora/blob/main/fashion-test1-e5.safetensors). Classified with LAION Aesthetic v2.
- 75 hair bun photos
- 90 medium hair photos
<div class="flex overflow-scroll">
<img class="max-w-lg" src="https://huggingface.co/datasets/p1atdev/nobodies/resolve/main/samples/portrait/vol2-a.jpg" />
<img class="max-w-lg" src="https://huggingface.co/datasets/p1atdev/nobodies/resolve/main/samples/portrait/vol2-b.jpg" />
</div>
|
fhai50032/HINGLISH-LIMA | ---
dataset_info:
features:
- name: index
dtype: int64
- name: Hinglish
dtype: string
- name: English
sequence: string
splits:
- name: train
num_bytes: 5971514
num_examples: 1330
download_size: 3185547
dataset_size: 5971514
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eai6/bungoma_training | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 33055806.0
num_examples: 315
- name: test
num_bytes: 6265849.0
num_examples: 36
download_size: 31141651
dataset_size: 39321655.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
deutschebahn/mnist | ---
license: unknown
---
|
selinerdem/german-orca | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt_en
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 18930958
num_examples: 10003
- name: test
num_bytes: 2085252
num_examples: 1123
download_size: 0
dataset_size: 21016210
---
# Dataset Card for "german-orca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat | ---
pretty_name: Evaluation run of JosephusCheung/Yee-34B-200K-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Yee-34B-200K-Chat](https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-05T04:15:54.776905](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat/blob/main/results_2023-12-05T04-15-54.776905.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7397087702526806,\n\
\ \"acc_stderr\": 0.028697152379174293,\n \"acc_norm\": 0.749145830773331,\n\
\ \"acc_norm_stderr\": 0.029232668522838182,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.538842608150276,\n\
\ \"mc2_stderr\": 0.015448158590971197\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893446,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6506671977693687,\n\
\ \"acc_stderr\": 0.0047578490234119605,\n \"acc_norm\": 0.8432583150766779,\n\
\ \"acc_norm_stderr\": 0.003628140427399768\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.023108393799841326,\n\
\ \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.023108393799841326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n\
\ \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \
\ \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889774,\n\
\ \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889774\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6375661375661376,\n \"acc_stderr\": 0.024757473902752045,\n \"\
acc_norm\": 0.6375661375661376,\n \"acc_norm_stderr\": 0.024757473902752045\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8612903225806452,\n \"acc_stderr\": 0.019662961321414027,\n \"\
acc_norm\": 0.8612903225806452,\n \"acc_norm_stderr\": 0.019662961321414027\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02548549837334323,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02548549837334323\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047926,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047926\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527046,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527046\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.0210206726808279,\n \
\ \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.0210206726808279\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660077,\n \
\ \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660077\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"\
acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n\
\ \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.026222235171477374,\n\
\ \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.026222235171477374\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455386,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562586,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562586\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n\
\ \"acc_stderr\": 0.010830724713134182,\n \"acc_norm\": 0.8978288633461047,\n\
\ \"acc_norm_stderr\": 0.010830724713134182\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7195530726256983,\n\
\ \"acc_stderr\": 0.015024083883322895,\n \"acc_norm\": 0.7195530726256983,\n\
\ \"acc_norm_stderr\": 0.015024083883322895\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n\
\ \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787679,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787679\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.02423101337054109,\n\
\ \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.02423101337054109\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \
\ \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098615,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.538842608150276,\n\
\ \"mc2_stderr\": 0.015448158590971197\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.01128501375404745\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3479909021986353,\n \
\ \"acc_stderr\": 0.013120581030382132\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|arc:challenge|25_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|gsm8k|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hellaswag|10_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T04-15-54.776905.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- '**/details_harness|winogrande|5_2023-12-05T04-15-54.776905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-05T04-15-54.776905.parquet'
- config_name: results
data_files:
- split: 2023_12_05T04_15_54.776905
path:
- results_2023-12-05T04-15-54.776905.parquet
- split: latest
path:
- results_2023-12-05T04-15-54.776905.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Yee-34B-200K-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Yee-34B-200K-Chat](https://huggingface.co/JosephusCheung/Yee-34B-200K-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat",
"harness_winogrande_5",
split="train")
```
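The aggregated metrics mentioned above can be loaded the same way from the `"results"` configuration. A minimal sketch (assuming network access to the Hub and the same `load_dataset` API as the example above), using the `"latest"` split alias instead of a timestamped split:

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat",
    "results",
    split="latest",
)
```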
## Latest results
These are the [latest results from run 2023-12-05T04:15:54.776905](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Yee-34B-200K-Chat/blob/main/results_2023-12-05T04-15-54.776905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7397087702526806,
"acc_stderr": 0.028697152379174293,
"acc_norm": 0.749145830773331,
"acc_norm_stderr": 0.029232668522838182,
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.538842608150276,
"mc2_stderr": 0.015448158590971197
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893446,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6506671977693687,
"acc_stderr": 0.0047578490234119605,
"acc_norm": 0.8432583150766779,
"acc_norm_stderr": 0.003628140427399768
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.023108393799841326,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.023108393799841326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6375661375661376,
"acc_stderr": 0.024757473902752045,
"acc_norm": 0.6375661375661376,
"acc_norm_stderr": 0.024757473902752045
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8612903225806452,
"acc_stderr": 0.019662961321414027,
"acc_norm": 0.8612903225806452,
"acc_norm_stderr": 0.019662961321414027
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02548549837334323,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02548549837334323
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047926,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047926
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527046,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.0210206726808279,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.0210206726808279
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660077,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.01809424711647332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.01809424711647332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.026222235171477374,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.026222235171477374
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455386,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562586,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.010830724713134182,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.010830724713134182
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7195530726256983,
"acc_stderr": 0.015024083883322895,
"acc_norm": 0.7195530726256983,
"acc_norm_stderr": 0.015024083883322895
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787679,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.02423101337054109,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.02423101337054109
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098615,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.538842608150276,
"mc2_stderr": 0.015448158590971197
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.01128501375404745
},
"harness|gsm8k|5": {
"acc": 0.3479909021986353,
"acc_stderr": 0.013120581030382132
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d025b660 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1320
dataset_size: 188
---
# Dataset Card for "d025b660"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hazyresearch/based-swde | ---
dataset_info:
features:
- name: doc_id
dtype: string
- name: file_name
dtype: string
- name: key
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: validation
num_bytes: 4651754
num_examples: 1111
download_size: 1824942
dataset_size: 4651754
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
task_categories:
- question-answering
- feature-extraction
--- |
Kolibri753/generate-workouts | ---
license: openrail
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 184018
num_examples: 101
download_size: 82785
dataset_size: 184018
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/clueweb12_b13_clef-ehealth_cs | ---
pretty_name: '`clueweb12/b13/clef-ehealth/cs`'
viewer: false
source_datasets: ['irds/clueweb12_b13']
task_categories:
- text-retrieval
---
# Dataset Card for `clueweb12/b13/clef-ehealth/cs`
The `clueweb12/b13/clef-ehealth/cs` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clueweb12#clueweb12/b13/clef-ehealth/cs).
# Data
This dataset provides:
- `queries` (i.e., topics); count=300
- `qrels`: (relevance assessments); count=269,232
- For `docs`, use [`irds/clueweb12_b13`](https://huggingface.co/datasets/irds/clueweb12_b13)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/clueweb12_b13_clef-ehealth_cs', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/clueweb12_b13_clef-ehealth_cs', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'trustworthiness': ..., 'understandability': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
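As a minimal sketch of working with the qrels records described above (using in-memory stand-in records with the same fields, rather than the real download, so it runs offline), you can group the judgments per query:

```python
from collections import defaultdict

# Stand-in qrels records; real ones come from
# load_dataset('irds/clueweb12_b13_clef-ehealth_cs', 'qrels') and carry the
# additional 'trustworthiness', 'understandability', and 'iteration' fields.
qrels = [
    {'query_id': 'q1', 'doc_id': 'd1', 'relevance': 1},
    {'query_id': 'q1', 'doc_id': 'd2', 'relevance': 0},
    {'query_id': 'q2', 'doc_id': 'd3', 'relevance': 2},
]

# Build a query_id -> {doc_id: relevance} mapping.
judged = defaultdict(dict)
for rec in qrels:
    judged[rec['query_id']][rec['doc_id']] = rec['relevance']

print(judged['q1'])  # {'d1': 1, 'd2': 0}
```

The resulting mapping can then be joined against the document text in [`irds/clueweb12_b13`](https://huggingface.co/datasets/irds/clueweb12_b13) by `doc_id`.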
## Citation Information
```
@inproceedings{Zuccon2016ClefEhealth,
title={The IR Task at the CLEF eHealth Evaluation Lab 2016: User-centred Health Information Retrieval},
author={Guido Zuccon and Joao Palotti and Lorraine Goeuriot and Liadh Kelly and Mihai Lupu and Pavel Pecina and Henning M{\"u}ller and Julie Budaher and Anthony Deacon},
booktitle={CLEF},
year={2016}
}
@inproceedings{Palotti2017ClefEhealth,
title={CLEF 2017 Task Overview: The IR Task at the eHealth Evaluation Lab - Evaluating Retrieval Methods for Consumer Health Search},
author={Joao Palotti and Guido Zuccon and Jimmy and Pavel Pecina and Mihai Lupu and Lorraine Goeuriot and Liadh Kelly and Allan Hanbury},
booktitle={CLEF},
year={2017}
}
```
|
open-llm-leaderboard/details_andysalerno__rainbowfish-v7 | ---
pretty_name: Evaluation run of andysalerno/rainbowfish-v7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/rainbowfish-v7](https://huggingface.co/andysalerno/rainbowfish-v7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__rainbowfish-v7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T19:51:13.716152](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-v7/blob/main/results_2024-02-09T19-51-13.716152.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6298768459917149,\n\
\ \"acc_stderr\": 0.03257497035953263,\n \"acc_norm\": 0.6356065924410188,\n\
\ \"acc_norm_stderr\": 0.033234895186529965,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4977624814777941,\n\
\ \"mc2_stderr\": 0.01511189422251918\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.01443413871337998,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6328420633339972,\n\
\ \"acc_stderr\": 0.004810449343572396,\n \"acc_norm\": 0.8252340171280621,\n\
\ \"acc_norm_stderr\": 0.0037899067926446877\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n\
\ \"acc_stderr\": 0.015491756531894638,\n \"acc_norm\": 0.311731843575419,\n\
\ \"acc_norm_stderr\": 0.015491756531894638\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4977624814777941,\n\
\ \"mc2_stderr\": 0.01511189422251918\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773239\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37452615617892343,\n \
\ \"acc_stderr\": 0.013331774158491388\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/rainbowfish-v7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-51-13.716152.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-51-13.716152.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- '**/details_harness|winogrande|5_2024-02-09T19-51-13.716152.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T19-51-13.716152.parquet'
- config_name: results
data_files:
- split: 2024_02_09T19_51_13.716152
path:
- results_2024-02-09T19-51-13.716152.parquet
- split: latest
path:
- results_2024-02-09T19-51-13.716152.parquet
---
# Dataset Card for Evaluation run of andysalerno/rainbowfish-v7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-v7](https://huggingface.co/andysalerno/rainbowfish-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-v7",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T19:51:13.716152](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-v7/blob/main/results_2024-02-09T19-51-13.716152.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6298768459917149,
"acc_stderr": 0.03257497035953263,
"acc_norm": 0.6356065924410188,
"acc_norm_stderr": 0.033234895186529965,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4977624814777941,
"mc2_stderr": 0.01511189422251918
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.01443413871337998,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6328420633339972,
"acc_stderr": 0.004810449343572396,
"acc_norm": 0.8252340171280621,
"acc_norm_stderr": 0.0037899067926446877
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894638,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4977624814777941,
"mc2_stderr": 0.01511189422251918
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773239
},
"harness|gsm8k|5": {
"acc": 0.37452615617892343,
"acc_stderr": 0.013331774158491388
}
}
```
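As a sketch of how the per-task metrics above can be aggregated programmatically (this is illustrative, not an official leaderboard script): the results JSON maps `harness|<task>|<n_shot>` keys to metric dictionaries, so a mean MMLU accuracy can be computed by filtering on the `hendrycksTest` prefix. The dictionary below copies just three of the tasks printed above for brevity; a real run would contain all 57.

```python
# A small excerpt of the results dict shown above (three of the 57 MMLU tasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6644736842105263},
}

# Collect the accuracies of every MMLU (hendrycksTest) task and average them.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))
```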
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
davidgaofc/Shadow_prompts | ---
license: mit
dataset_info:
features:
- name: Question
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 102611
num_examples: 1640
download_size: 40173
dataset_size: 102611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
halloyu84/llama2_finetune_ryu | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
facebook/PUG_Animals | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: image
dtype: image
- name: world_name
dtype: string
- name: character_name
dtype: string
- name: character_scale
dtype: float64
- name: camera_yaw
dtype: int64
- name: character_texture
dtype: string
splits:
- name: train
num_bytes: 82030062942.72
num_examples: 215040
download_size: 84628407574
dataset_size: 82030062942.72
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## PUG Animals
The PUG: Animals dataset contains 215,040 images pre-rendered with Unreal Engine, covering 70 animal assets, 64 environments, 3 sizes, 4 textures, and 4 camera orientations.
It was designed so that the factors of variation are explicitly available. Inspired by research on out-of-distribution generalization, PUG: Animals allows one to precisely control distribution shifts between training and testing, which can provide better insight into how a deep neural network generalizes to held-out variation factors.
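As a sanity check, the number of train examples in the metadata matches the full cross-product of the variation factors listed above (a small sketch; the factor counts are taken from the description):

```python
# Factor counts as stated in the dataset description.
factors = {
    "animal_assets": 70,
    "environments": 64,
    "sizes": 3,
    "textures": 4,
    "camera_orientations": 4,
}

# The dataset enumerates every combination of factors, so the total
# image count is the product of the individual factor counts.
total = 1
for count in factors.values():
    total *= count

print(total)  # 215040, the number of examples in the train split
```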
## LICENSE
The datasets are distributed under the CC-BY-NC license, with the addendum that they must not be used to train Generative AI models.
## Citing PUG
If you use one of the PUG datasets, please cite:
```
@misc{bordes2023pug,
title={PUG: Photorealistic and Semantically Controllable Synthetic Data for Representation Learning},
author={Florian Bordes and Shashank Shekhar and Mark Ibrahim and Diane Bouchacourt and Pascal Vincent and Ari S. Morcos},
year={2023},
eprint={2308.03977},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## To learn more about the PUG datasets:
Please visit the [website](https://pug.metademolab.com/) and the [github](https://github.com/facebookresearch/PUG) |
qmeeus/smart-lights-en-close-field | ---
dataset_info:
features:
- name: uttid
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: intent
dtype:
class_label:
names:
'0': DecreaseBrightness
'1': IncreaseBrightness
'2': SetLightBrightness
'3': SetLightColor
'4': SwitchLightOff
'5': SwitchLightOn
- name: entities
sequence:
class_label:
names:
'0': B-LOC
'1': I-LOC
'2': B-COL
'3': I-COL
'4': B-NUM
'5': I-NUM
'6': O
- name: speaker
struct:
- name: age
dtype: int64
- name: country
dtype: string
- name: gender
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 124895101.58399998
num_examples: 1328
- name: validation
num_bytes: 15339937.9
num_examples: 166
- name: test
num_bytes: 15496384.9
num_examples: 166
download_size: 129906544
dataset_size: 155731424.38399997
---
# Dataset Card for "smart-lights-en-close-field"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bugdaryan/sql-create-context-instruction | ---
license: cc-by-4.0
task_categories:
- text-generation
- question-answering
- table-question-answering
language:
- en
tags:
- SQL
- code
- NLP
- text-to-sql
- context-sql
- spider
- wikisql
- sqlglot
pretty_name: sql-create-context
size_categories:
- 10K<n<100K
---
## Overview
This dataset is built upon [SQL Create Context](https://huggingface.co/datasets/b-mc2/sql-create-context), which in turn was constructed using data from [WikiSQL](https://huggingface.co/datasets/wikisql) and [Spider](https://huggingface.co/datasets/spider).
There are 78,577 examples of natural language queries, SQL CREATE TABLE statements, and a SQL query answering the question using the CREATE statement as context. This dataset was built with text-to-SQL LLMs in mind, intending to prevent the hallucination of column and table names often seen when training on text-to-SQL datasets. The CREATE TABLE statement can often be copied and pasted from different DBMSs and provides table names, column names, and their data types. By providing just the CREATE TABLE statement as context, we can hopefully provide better grounding for models without having to provide actual rows of data, limiting token usage and exposure to private, sensitive, or proprietary data.
## Cleansing and Augmentation
Cleansing and data augmentation have been done on the combined WikiSQL and Spider data. [SQLGlot](https://github.com/tobymao/sqlglot) was used on queries from Spider and WikiSQL to parse them into different tables and columns. Column data types were inferred based on the usage of `>` `<` operators as well as the use of `MIN()`, `MAX()`, `AVG()`, and `SUM()` on columns. While this isn't perfect, it increases the likelihood of inferring the correct datatype for a column; otherwise, columns default to VARCHAR type. These tables and columns are then used to generate CREATE TABLE statements using the inferred types. SQLGlot is used again to ensure both the SQL queries and CREATE TABLE statements parse without errors.
Some queries that do not have column names, e.g., `SELECT * FROM table`, have a default `Id` column added to the CREATE TABLE statement. Some other queries that use the generic `table` as the FROM table have instead been changed to a variation of `table_name_1` or some other number, which is also reflected in the CREATE TABLE statement.
## TODO
- Further augment the data by converting queries and CREATE TABLE statements into different SQL dialects. This can be done with SQLGlot. Reference to the dialect might also be added to the question.
**Random sample:**
```json
[
  {
    "text" : "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE head (age INTEGER) Question: How many heads of the departments are older than 56 ? [/INST] Here is the SQLite query to answer to the question: How many heads of the departments are older than 56 ?: ``` SELECT COUNT(*) FROM head WHERE age > 56 ```"
  },
  {
    "text" : "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE head (name VARCHAR, born_state VARCHAR, age VARCHAR) Question: List the name, born state and age of the heads of departments ordered by age. [/INST] Here is the SQLite query to answer to the question: List the name, born state and age of the heads of departments ordered by age.: ``` SELECT name, born_state, age FROM head ORDER BY age ```"
  }
]
```
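A minimal sketch of assembling a prompt in the same instruct style as the samples above (the helper name and the schema/question values are illustrative, not part of the dataset):

```python
def build_prompt(schema: str, question: str) -> str:
    """Build an [INST]-style prompt from a CREATE TABLE schema and a question."""
    instruction = (
        "Write SQLite query to answer the following question given the "
        "database schema. Please wrap your code answer using ```: "
        f"Schema: {schema} Question: {question}"
    )
    return f"[INST] {instruction} [/INST]"

# Illustrative values taken from the first sample record above.
prompt = build_prompt(
    "CREATE TABLE head (age INTEGER)",
    "How many heads of the departments are older than 56 ?",
)
print(prompt)
```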
The dataset was used to create code-llama-2 style prompts. The basic prompt template is:
```
[INST] Instruction/context [/INST]
Model output
``` |
tyzhu/fwv2_random_num_train_1000_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 195871
num_examples: 2100
- name: train_doc2id
num_bytes: 92393
num_examples: 1100
- name: train_id2doc
num_bytes: 95693
num_examples: 1100
- name: train_find_word
num_bytes: 100178
num_examples: 1000
- name: eval_find_word
num_bytes: 10146
num_examples: 100
- name: id_context_mapping
num_bytes: 60493
num_examples: 1100
download_size: 0
dataset_size: 554774
---
# Dataset Card for "fwv2_random_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chainyo/rvl-cdip-questionnaire | ---
license: other
---
⚠️ This is only a subset of the original dataset, containing only the `questionnaire` class.
The RVL-CDIP (Ryerson Vision Lab Complex Document Information Processing) dataset consists of 400,000 grayscale images in 16 classes, with 25,000 images per class. There are 320,000 training images, 40,000 validation images, and 40,000 test images. The images are sized so their largest dimension does not exceed 1000 pixels.
For questions and comments please contact Adam Harley (aharley@scs.ryerson.ca).
The full dataset can be found [here](https://www.cs.cmu.edu/~aharley/rvl-cdip/).
## Labels
0: letter
1: form
2: email
3: handwritten
4: advertisement
5: scientific report
6: scientific publication
7: specification
8: file folder
9: news article
10: budget
11: invoice
12: presentation
13: questionnaire
14: resume
15: memo
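A small sketch of the id-to-label mapping above as a Python dict, handy when decoding model predictions (the names follow the list above):

```python
# Integer class ids mapped to the 16 RVL-CDIP label names listed above.
ID2LABEL = {
    0: "letter", 1: "form", 2: "email", 3: "handwritten",
    4: "advertisement", 5: "scientific report", 6: "scientific publication",
    7: "specification", 8: "file folder", 9: "news article",
    10: "budget", 11: "invoice", 12: "presentation",
    13: "questionnaire", 14: "resume", 15: "memo",
}

# Inverse mapping, e.g. for encoding labels during training.
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

print(LABEL2ID["questionnaire"])  # 13, the only class present in this subset
```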
## Citation
This dataset is from this [paper](https://www.cs.cmu.edu/~aharley/icdar15/) `A. W. Harley, A. Ufkes, K. G. Derpanis, "Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval," in ICDAR, 2015`
## License
RVL-CDIP is a subset of IIT-CDIP, which came from the [Legacy Tobacco Document Library](https://www.industrydocuments.ucsf.edu/tobacco/), for which license information can be found [here](https://www.industrydocuments.ucsf.edu/help/copyright/).
## References
1. D. Lewis, G. Agam, S. Argamon, O. Frieder, D. Grossman, and J. Heard, "Building a test collection for complex document information processing," in Proc. 29th Annual Int. ACM SIGIR Conference (SIGIR 2006), pp. 665-666, 2006
2. The Legacy Tobacco Document Library (LTDL), University of California, San Francisco, 2007. http://legacy.library.ucsf.edu/. |
FunDialogues/healthcare-minor-consultation | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
language:
- en
tags:
- fictitious dialogues
- prototyping
- healthcare
pretty_name: 'healthcare-minor-consultation'
size_categories:
- n<1K
---
# This Dialogue
This dataset comprises fictitious examples of dialogues between a doctor and a patient during a minor medical consultation. Check out the example below:
```
"id": 1,
"description": "Discussion about a common cold",
"dialogue": "Patient: Doctor, I've been feeling congested and have a runny nose. What can I do to relieve these symptoms?\n\nDoctor: It sounds like you have a common cold. You can try over-the-counter decongestants to relieve congestion and saline nasal sprays to help with the runny nose. Make sure to drink plenty of fluids and get enough rest as well."
```
# How to Load Dialogues
Loading dialogues can be accomplished using the fun dialogues library or Hugging Face datasets library.
## Load using fun dialogues
1. Install fun dialogues package
`pip install fundialogues`
2. Use loader utility to load dataset as pandas dataframe. Further processing might be required for use.
```
from fundialogues import dialoader
# load as pandas dataframe
minor_consultation = dialoader("FunDialogues/healthcare-minor-consultation")
```
## Loading using Hugging Face datasets
1. Install datasets package
2. Load using datasets
```
from datasets import load_dataset
dataset = load_dataset("FunDialogues/healthcare-minor-consultation")
```
## How to Contribute
If you want to contribute to this project and make it better, your help is very welcome. Contributing is also a great way to learn more about social coding on GitHub, new technologies and their ecosystems, and how to make constructive, helpful bug reports, feature requests, and the noblest of all contributions: a good, clean pull request.
### Contributing Your Own Dialogues
If you want to contribute to an existing dialogue or add a new dialogue, please open an issue and I will follow up with you ASAP!
### Implementing Patches and Bug Fixes
- Create a personal fork of the project on Github.
- Clone the fork on your local machine. Your remote repo on Github is called origin.
- Add the original repository as a remote called upstream.
- If you created your fork a while ago be sure to pull upstream changes into your local repository.
- Create a new branch to work on! Branch from develop if it exists, else from master.
- Implement/fix your feature, comment your code.
- Follow the code style of the project, including indentation.
- If the component has tests run them!
- Write or adapt tests as needed.
- Add or change the documentation as needed.
- Squash your commits into a single commit with git's interactive rebase. Create a new branch if necessary.
- Push your branch to your fork on Github, the remote origin.
- From your fork open a pull request in the correct branch. Target the project's develop branch if there is one, else go for master!
If the maintainer requests further changes just push them to your branch. The PR will be updated automatically.
Once the pull request is approved and merged you can pull the changes from upstream to your local repo and delete your extra branch(es).
And last but not least: Always write your commit messages in the present tense. Your commit message should describe what the commit, when applied, does to the code – not what you did to the code.
# Disclaimer
The dialogues contained in this repository are provided for experimental purposes only. It is important to note that these dialogues are assumed to be original work by a human and are entirely fictitious, despite the possibility of some examples including factually correct information. The primary intention behind these dialogues is to serve as a tool for language modeling experimentation and should not be used for designing real-world products beyond non-production prototyping.
Please be aware that the utilization of fictitious data in these datasets may increase the likelihood of language model artifacts, such as hallucinations or unrealistic responses. Therefore, it is essential to exercise caution and discretion when employing these datasets for any purpose.
It is crucial to emphasize that none of the scenarios described in the fun dialogues dataset should be relied upon to provide advice or guidance to humans. These scenarios are purely fictitious and are intended solely for demonstration purposes. Any resemblance to real-world situations or individuals is entirely coincidental.
The responsibility for the usage and application of these datasets rests solely with the individual or entity employing them. By accessing and utilizing these dialogues and all contents of the repository, you acknowledge that you have read and understood this disclaimer, and you agree to use them at your own discretion and risk. |
joseluhf11/oct-fovea-detection | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
splits:
- name: train
num_bytes: 350015166.0
num_examples: 431
download_size: 349205446
dataset_size: 350015166.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Maximofn/opus100 | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: test
num_bytes: 326262
num_examples: 2000
- name: train
num_bytes: 136643104
num_examples: 1000000
- name: validation
num_bytes: 326727
num_examples: 2000
download_size: 100103904
dataset_size: 137296093
---
|
Back-up/toxicContenData | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: update
dtype: int64
splits:
- name: train
num_bytes: 174657
num_examples: 626
download_size: 93236
dataset_size: 174657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "toxicContenData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gutenberg_time | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
paperswithcode_id: gutenberg-time-dataset
pretty_name: the Gutenberg Time dataset
dataset_info:
features:
- name: guten_id
dtype: string
- name: hour_reference
dtype: string
- name: time_phrase
dtype: string
- name: is_ambiguous
dtype: bool_
- name: time_pos_start
dtype: int64
- name: time_pos_end
dtype: int64
- name: tok_context
dtype: string
config_name: gutenberg
splits:
- name: train
num_bytes: 108550391
num_examples: 120694
download_size: 35853781
dataset_size: 108550391
---
# Dataset Card for the Gutenberg Time dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Repository](https://github.com/allenkim/what-time-is-it)**
- **[Paper](https://arxiv.org/abs/2011.04124)**
### Dataset Summary
A clean data resource containing all explicit time references in a dataset of 52,183 novels whose full text is available via Project Gutenberg.
### Supported Tasks and Leaderboards
Time-of-the-day classification from excerpts.
### Languages
The text is in English.
## Dataset Structure
### Data Instances
```
{
"guten_id": 28999,
"hour_reference": 12,
"time_phrase": "midday",
"is_ambiguous": False,
"time_pos_start": 133,
"time_pos_end": 134,
"tok_context": "Sorrows and trials she had had in plenty in her life , but these the sweetness of her nature had transformed , so that from being things difficult to bear , she had built up with them her own character . Sorrow had increased her own power of sympathy ; out of trials she had learnt patience ; and failure and the gradual sinking of one she had loved into the bottomless slough of evil habit had but left her with an added dower of pity and tolerance . So the past had no sting left , and if iron had ever entered into her soul it now but served to make it strong . She was still young , too ; it was not near sunset with her yet , nor even midday , and the future that , humanly speaking , she counted to be hers was almost dazzling in its brightness . For love had dawned for her again , and no uncertain love , wrapped in the mists of memory , but one that had ripened through liking and friendship and intimacy into the authentic glory . He was in England , too ; she was going back to him . And before very long she would never go away from him again ."
}
```
### Data Fields
```
guten_id - Gutenberg ID number
hour_reference - hour from 0 to 23
time_phrase - the phrase corresponding to the referenced hour
is_ambiguous - boolean whether it is clear whether time is AM or PM
time_pos_start - token position where time_phrase begins
time_pos_end - token position where time_phrase ends (exclusive)
tok_context - context in which time_phrase appears as space-separated tokens
```
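The token positions index into the whitespace-tokenized `tok_context`, with `time_pos_end` exclusive; a small sketch with a toy example (the sentence is illustrative, not from the dataset):

```python
def extract_time_phrase(tok_context: str, start: int, end: int) -> str:
    """Recover the time phrase from space-separated tokens; `end` is exclusive."""
    tokens = tok_context.split(" ")
    return " ".join(tokens[start:end])

# Toy context, tokenized by single spaces as in the dataset.
context = "She woke at half past seven and went downstairs ."
print(extract_time_phrase(context, 3, 6))  # half past seven
```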
### Data Splits
No data splits.
## Dataset Creation
### Curation Rationale
The flow of time is an indispensable guide for our actions and provides a framework in which to see a logical progression of events. Just as in real life, the clock provides the background against which literary works play out: when characters wake, eat, and act. In most works of fiction, the events of the story take place during recognizable time periods over the course of the day. Recognizing a story’s flow through time is essential to understanding the text. In this paper, we try to capture the flow of time through novels by attempting to recognize what time of day each event in the story takes place at.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Novel authors.
### Annotations
#### Annotation process
Manually annotated.
#### Who are the annotators?
Two of the authors.
### Personal and Sensitive Information
No Personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Allen Kim, Charuta Pethe and Steven Skiena, Stony Brook University
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{kim2020time,
title={What time is it? Temporal Analysis of Novels},
author={Allen Kim and Charuta Pethe and Steven Skiena},
year={2020},
eprint={2011.04124},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@TevenLeScao](https://github.com/TevenLeScao) for adding this dataset. |
Helsinki-NLP/opus_rf | ---
annotations_creators:
- found
language_creators:
- expert-generated
language:
- de
- en
- es
- fr
- sv
license:
- unknown
multilinguality:
- multilingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: OpusRf
config_names:
- de-en
- de-es
- de-fr
- de-sv
- en-es
- en-fr
- en-sv
- es-fr
- es-sv
- fr-sv
dataset_info:
- config_name: de-en
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: train
num_bytes: 38671
num_examples: 177
download_size: 25572
dataset_size: 38671
- config_name: de-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- es
splits:
- name: train
num_bytes: 2304
num_examples: 24
download_size: 3690
dataset_size: 2304
- config_name: de-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- fr
splits:
- name: train
num_bytes: 41288
num_examples: 173
download_size: 26724
dataset_size: 41288
- config_name: de-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- sv
splits:
- name: train
num_bytes: 37402
num_examples: 178
download_size: 25101
dataset_size: 37402
- config_name: en-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: train
num_bytes: 2588
num_examples: 25
download_size: 3865
dataset_size: 2588
- config_name: en-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 39491
num_examples: 175
download_size: 25966
dataset_size: 39491
- config_name: en-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- sv
splits:
- name: train
num_bytes: 35766
num_examples: 180
download_size: 24513
dataset_size: 35766
- config_name: es-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- fr
splits:
- name: train
num_bytes: 2507
num_examples: 21
download_size: 3789
dataset_size: 2507
- config_name: es-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- sv
splits:
- name: train
num_bytes: 3098
num_examples: 28
download_size: 4227
dataset_size: 3098
- config_name: fr-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- sv
splits:
- name: train
num_bytes: 38615
num_examples: 175
download_size: 25822
dataset_size: 38615
configs:
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- config_name: de-es
data_files:
- split: train
path: de-es/train-*
- config_name: de-fr
data_files:
- split: train
path: de-fr/train-*
- config_name: de-sv
data_files:
- split: train
path: de-sv/train-*
- config_name: en-es
data_files:
- split: train
path: en-es/train-*
- config_name: en-fr
data_files:
- split: train
path: en-fr/train-*
- config_name: en-sv
data_files:
- split: train
path: en-sv/train-*
- config_name: es-fr
data_files:
- split: train
path: es-fr/train-*
- config_name: es-sv
data_files:
- split: train
path: es-sv/train-*
- config_name: fr-sv
data_files:
- split: train
path: fr-sv/train-*
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://opus.nlpl.eu/RF.php
- **Repository:**
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
RF is a tiny parallel corpus of the Declarations of the Swedish Government and its translations.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English (en), Spanish (es), German (de), French (fr), Swedish (sv)
## Dataset Structure
### Data Instances
[More Information Needed]
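The card's metadata lists `id` and `translation` features, where `translation` is a dict keyed by language code; a purely illustrative sketch of handling a record with that shape (the sentences below are made up, not from the corpus):

```python
# A record shaped like the `en-sv` config's features (contents are invented).
record = {
    "id": "42",
    "translation": {
        "en": "The Government presents its declaration.",
        "sv": "Regeringen avger sin deklaration.",
    },
}

def to_pair(example: dict, src: str, tgt: str) -> tuple:
    """Pull a (source, target) sentence pair out of one example."""
    tr = example["translation"]
    return tr[src], tr[tgt]

src_text, tgt_text = to_pair(record, "en", "sv")
print(src_text)
```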
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@InProceedings{TIEDEMANN12.463,
author = {J{\"o}rg Tiedemann},
title = {Parallel Data, Tools and Interfaces in OPUS},
booktitle = {Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC'12)},
year = {2012},
month = {may},
date = {23-25},
address = {Istanbul, Turkey},
editor = {Nicoletta Calzolari (Conference Chair) and Khalid Choukri and Thierry Declerck and Mehmet Ugur Dogan and Bente Maegaard and Joseph Mariani and Jan Odijk and Stelios Piperidis},
publisher = {European Language Resources Association (ELRA)},
isbn = {978-2-9517408-7-7},
language = {english}
}
```
### Contributions
Thanks to [@akshayb7](https://github.com/akshayb7) for adding this dataset. |
ignacioct/math_topics | ---
dataset_info:
features:
- name: seeds
dtype: string
splits:
- name: train
num_bytes: 153
num_examples: 10
download_size: 944
dataset_size: 153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|