| datasetId | card |
|---|---|
ajibawa-2023/Code-290k-ShareGPT | ---
license: apache-2.0
task_categories:
- conversational
- text-generation
language:
- en
tags:
- code
size_categories:
- 100K<n<1M
---
**Code-290k-ShareGPT**
This dataset is in Vicuna/ShareGPT format. It contains around 290,000 conversation sets, each consisting of two conversations.
Code in Python, Java, JavaScript, Go, C++, Rust, Ruby, SQL, MySQL, R, Julia, Haskell, and other languages is provided with detailed explanations.
This dataset builds upon my existing datasets [Python-Code-23k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Python-Code-23k-ShareGPT)
and [Code-74k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-74k-ShareGPT)
My Models [Python-Code-13B](https://huggingface.co/ajibawa-2023/Python-Code-13B) and [Python-Code-33B](https://huggingface.co/ajibawa-2023/Python-Code-33B) are trained on [Python-Code-23k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Python-Code-23k-ShareGPT).
My Models [Code-13B](https://huggingface.co/ajibawa-2023/Code-13B) and [Code-33B](https://huggingface.co/ajibawa-2023/Code-33B) are trained on [Code-74k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-74k-ShareGPT).
I am building a few models using the **Code-290k-ShareGPT** dataset. |
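For reference, a single entry in the Vicuna/ShareGPT format described above typically looks like the following. This is an illustrative sketch: the field names follow the common ShareGPT convention, and the sample content is invented, not taken from the dataset.

```python
# Illustrative ShareGPT-style record (field names assumed from the common
# ShareGPT convention; the content is invented for demonstration).
example = {
    "id": "sample-0",
    "conversations": [
        {"from": "human", "value": "Write a Python function that reverses a string."},
        {"from": "gpt", "value": "def reverse_string(s):\n    return s[::-1]"},
    ],
}

# Each conversation set contains two turns: the human prompt and the model reply.
assert len(example["conversations"]) == 2
```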
radhakrishnanrajan/guanaco-llama2-1k-rk | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/etou_misaki_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of etou_misaki/衛藤美紗希 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of etou_misaki/衛藤美紗希 (THE iDOLM@STER: Cinderella Girls), containing 41 images and their tags.
The core tags of this character are `brown_hair, long_hair, green_eyes, earrings, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 33.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etou_misaki_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 27.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etou_misaki_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 79 | 47.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etou_misaki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 37.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etou_misaki_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 79 | 57.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/etou_misaki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/etou_misaki_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------|
| 0 | 41 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, bracelet, character_name, cleavage, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | bracelet | character_name | cleavage | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-----------|:-----------------|:-----------|:-------------|
| 0 | 41 |  |  |  |  |  | X | X | X | X | X | X | X | X |
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_67 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1217972428.0
num_examples: 237329
download_size: 1245824053
dataset_size: 1217972428.0
---
# Dataset Card for "chunk_67"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AppleHarem/beagle_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of beagle (Arknights)
This is the dataset of beagle (Arknights), containing 35 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and other tools is also available: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 35 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 77 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 85 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 35 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 35 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 35 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 77 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 77 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 45 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 85 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 85 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
nateraw/quickdraw | ---
license: cc-by-4.0
---
|
aai520-group6/squad_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 116732025
num_examples: 130319
- name: validation
num_bytes: 11661091
num_examples: 11873
download_size: 0
dataset_size: 128393116
---
# Dataset Card for "squad_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wayne0019/autotrain-data-lwf-summarization | ---
language:
- zh
task_categories:
- summarization
---
# AutoTrain Dataset for project: lwf-summarization
## Dataset Description
This dataset has been automatically processed by AutoTrain for project lwf-summarization.
### Languages
The BCP-47 code for the dataset's language is zh.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_id": "13716782",
"target": "The scariest place for Jessica was the Capuchin Catacombs in Palermo.",
"text": "Kelly: Oh! Oh! Can I pick the first question?\r\nJessica: Sure. Go for it!\r\nKelly: What's the scariest place you've been to!\r\nJessica: I'll start: Palermo in Italy.\r\nMickey: And what's so scary about that? Did you break your nail? :P\r\nJessica: Shut it, Mickey! No, there are the Capuchin Catacombs with 8000 corpses! \r\nKelly: Ewwww! Corpses? Rly?\r\nJessica: Yeah! And you can look at them like museum exhibits. I think they're divided somehow, but have no clue how!\r\nOllie: That's so cool! Do you get to see the bones or are they covered up?\r\nJessica: Well, partly. Most of them were exhibited in their clothes. Basically only skulls and hands. \r\nMickey: I'm writing this one down! That's so precious!\r\nOllie: Me too!"
},
{
"feat_id": "13716592",
"target": "Carrie and Gina saw \"Fantastic Beast\" and liked it. Ginna loved Eddie Redmayne as Newt. ",
"text": "Carrie: Just back from Fantastic Beast :)\r\nGina: and what do you think?\r\nCarrie: generally good - as usual nice special effect and visuals, an ok plot, a glimpse of the wizarding community in the US.\r\nAlex: Sounds cool. I was thinking of going this weekend with Lane, but I've seen some bad reviews.\r\nCarrie: Depends on what you expect really - I have a lot of sentiment towards Harry Potter so, I'm gonna like everything the do. But seriously the movie was decent. However, if you're expecting to have your mind blown, then no, it's not THAT good.\r\nGina: I agree. I saw it last week and basically I'm satisfied.\r\nAlex: No spoilers, girls.\r\nCarrie: no worries ;)\r\nCarrie: And Gina, what do you think about Eddie Redmayne as Newt?\r\nGina: I loved him <3 I loved how introverted and awkward he was and how caring he was towards the animals. And with all that he showed a lot of confidence in his beliefs and was a genuinely compassionate character\r\nCarrie: not your standard protagonist, that's for sure\r\nGina: and that's what I liked about him\r\nAlex: Maybe I'll go and see it sooner so we can all talk about it.\r\nCarrie: go see it. If' you're not expecting god-knows-what you're going to enjoy it ;)"
}
]
```
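Records like the sample above can be handled as ordinary JSON. A minimal sketch follows; the record below is abbreviated from the sample for illustration.

```python
import json

# A minimal, abbreviated record mirroring the sample above
# (the "text" field is shortened for illustration).
sample = json.loads("""
[
  {
    "feat_id": "13716782",
    "target": "The scariest place for Jessica was the Capuchin Catacombs in Palermo.",
    "text": "Kelly: Oh! Oh! Can I pick the first question?"
  }
]
""")

record = sample[0]
# Each record pairs a dialogue ("text") with its one-line summary ("target").
print(record["target"])
```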
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_id": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 655 |
| valid | 164 |
|
choudhry2272/paf | ---
license: apache-2.0
---
|
priyam314/NST | ---
language:
- en
pretty_name: NST-Intermediate
size_categories:
- 1K<n<10K
--- |
jlbaker361/kaggle_females_dim_128_0.1k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 2258285.0
num_examples: 100
download_size: 2256434
dataset_size: 2258285.0
---
# Dataset Card for "kaggle_females_dim_128_0.1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ericyu/EGY_BCD | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 685483069.3837136
num_examples: 3654
- name: test
num_bytes: 226848178.30523786
num_examples: 1218
- name: val
num_bytes: 228364798.69204846
num_examples: 1219
download_size: 1135172308
dataset_size: 1140696046.381
---
# Dataset Card for "EGY_BCD"
This is an unofficial repo for the change detection dataset EGY-BCD. The dataset was randomly (seed=8888) split into subsets of train/val/test with a ratio of 6:2:2.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
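A seeded 6:2:2 split like the one described above can be sketched with Python's `random` module. This is a generic illustration under assumed mechanics; it does not claim to reproduce the exact file assignment of the original split.

```python
import random

def split_indices(n, seed=8888, ratios=(0.6, 0.2, 0.2)):
    """Shuffle indices with a fixed seed and cut them 6:2:2 into train/val/test.
    Illustrative only; the original split's exact procedure is not documented here."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)  # deterministic shuffle for reproducibility
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return {
        "train": idx[:n_train],
        "val": idx[n_train:n_train + n_val],
        "test": idx[n_train + n_val:],
    }

splits = split_indices(6091)  # 3654 + 1219 + 1218 image pairs in total
print({k: len(v) for k, v in splits.items()})
```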
zhangshuoming/c_x86_O0_exebench_numeric_1k_json_cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2427004.944
num_examples: 507
download_size: 190990
dataset_size: 2427004.944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c_x86_O0_exebench_numeric_1k_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-d1c2a643-13015772 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@grapplerulrich](https://huggingface.co/grapplerulrich) for evaluating this model. |
Riksarkivet/mini_cleaned_diachronic_swe | ---
dataset_info:
features:
- name: chunked_text
dtype: string
splits:
- name: test
num_bytes: 14891546.825140134
num_examples: 8410
- name: train
num_bytes: 729669858.1748599
num_examples: 412081
download_size: 480496204
dataset_size: 744561405.0
license: mit
language:
- sv
tags:
- historical
- WIP
pretty_name: Kbuhist2
size_categories:
- 1M<n<10M
---
# Dataset Card for mini_cleaned_diachronic_swe
The Swedish Diachronic Corpus is a project funded by [Swe-Clarin](https://sweclarin.se/eng) and provides a corpus of texts covering the time period from Old Swedish onwards.
The dataset has been preprocessed and can be recreated from here: [Src_code](https://github.com/Borg93/kbuhist2/tree/main).
## Dataset Summary
The dataset has been filtered using the following metadata criteria:
- Manually transcribed or post-OCR corrected
- No scrambled sentences
- Year of origin: 15th-19th century
### Data Splits
**This will be further extended!**
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 352137 |
| Test | 7187 |
## Acknowledgements
We gratefully acknowledge [SWE-clarin](https://sweclarin.se/) for the datasets.
## Citation Information
Eva Pettersson and Lars Borin (2022)
Swedish Diachronic Corpus
In Darja Fišer & Andreas Witt (eds.), CLARIN. The Infrastructure for Language Resources. Berlin: deGruyter. https://degruyter.com/document/doi/10.1515/9783110767377-022/html |
bjoernp/code_search_net_python_processed_400k | ---
dataset_info:
features:
- name: code
dtype: string
- name: signature
dtype: string
- name: docstring
dtype: string
- name: loss_without_docstring
dtype: float64
- name: loss_with_docstring
dtype: float64
- name: factor
dtype: float64
splits:
- name: train
num_bytes: 373144422
num_examples: 400244
download_size: 150980039
dataset_size: 373144422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_search_net_python_processed_400k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_17_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 193027
num_examples: 6699
download_size: 123620
dataset_size: 193027
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_17_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_51 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1238822496
num_examples: 243288
download_size: 1265551736
dataset_size: 1238822496
---
# Dataset Card for "chunk_51"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LiveEvil/Teshjsdf | ---
license: mit
---
|
surabhiMV/qrcode_val_new_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1623749.0
num_examples: 41
download_size: 1563056
dataset_size: 1623749.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qrcode_val_new_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
c-s-ale/alpaca-gpt4-data-zh | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 32150579
num_examples: 48818
download_size: 35100559
dataset_size: 32150579
license: cc-by-4.0
language:
- zh
pretty_name: Instruction Tuning with GPT-4
size_categories:
- 10K<n<100K
task_categories:
- text-generation
tags:
- gpt
- alpaca
- fine-tune
- instruct-tune
- instruction
---
# Dataset Description
- **Project Page:** https://instruction-tuning-with-gpt-4.github.io
- **Repo:** https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
- **Paper:** https://arxiv.org/abs/2304.03277
# Dataset Card for "alpaca-gpt4-data-zh"
All of the work was done by [this team](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM).
# Usage and License Notices
The data is intended and licensed for research use only. The dataset is CC BY NC 4.0 (allowing only non-commercial use) and models trained using the dataset should not be used outside of research purposes.
# English Dataset
[Found here](https://huggingface.co/datasets/c-s-ale/alpaca-gpt4-data)
# Citation
```
@article{peng2023gpt4llm,
title={Instruction Tuning with GPT-4},
  author={Baolin Peng and Chunyuan Li and Pengcheng He and Michel Galley and Jianfeng Gao},
journal={arXiv preprint arXiv:2304.03277},
year={2023}
}
``` |
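Records with `instruction`/`input`/`output` fields like those above are commonly rendered into an Alpaca-style prompt before fine-tuning. A minimal sketch follows; the template wording is the widely used Alpaca convention, an assumption rather than something stated in this card.

```python
def build_prompt(instruction, input_text=""):
    """Render an Alpaca-style prompt from an instruction/input record.
    The template wording follows the common Alpaca convention (an assumption,
    not taken from this dataset card)."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Summarize the following text.", "An example input.")
```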
open-llm-leaderboard/details_xformAI__opt-125m-gqa-ub-6-best-for-KV-cache | ---
pretty_name: Evaluation run of xformAI/opt-125m-gqa-ub-6-best-for-KV-cache
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xformAI/opt-125m-gqa-ub-6-best-for-KV-cache](https://huggingface.co/xformAI/opt-125m-gqa-ub-6-best-for-KV-cache)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xformAI__opt-125m-gqa-ub-6-best-for-KV-cache\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T12:11:33.435491](https://huggingface.co/datasets/open-llm-leaderboard/details_xformAI__opt-125m-gqa-ub-6-best-for-KV-cache/blob/main/results_2024-01-23T12-11-33.435491.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23214395574495633,\n\
\ \"acc_stderr\": 0.029929161673252165,\n \"acc_norm\": 0.23167592940331871,\n\
\ \"acc_norm_stderr\": 0.030715935929569317,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080515,\n \"mc2\": 0.4953131184469278,\n\
\ \"mc2_stderr\": 0.016004347037417377\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448069,\n\
\ \"acc_norm\": 0.24232081911262798,\n \"acc_norm_stderr\": 0.012521593295800118\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2590121489743079,\n\
\ \"acc_stderr\": 0.004371969542814558,\n \"acc_norm\": 0.24995020912168892,\n\
\ \"acc_norm_stderr\": 0.004320990543283153\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080515,\n\
\ \"mc2\": 0.4953131184469278,\n \"mc2_stderr\": 0.016004347037417377\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5169692186266772,\n\
\ \"acc_stderr\": 0.014044390401612974\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/xformAI/opt-125m-gqa-ub-6-best-for-KV-cache
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|arc:challenge|25_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|arc:challenge|25_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|gsm8k|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|gsm8k|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hellaswag|10_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hellaswag|10_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-06-15.262886.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-11-33.435491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T12-11-33.435491.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- '**/details_harness|winogrande|5_2024-01-23T12-06-15.262886.parquet'
- split: 2024_01_23T12_11_33.435491
path:
- '**/details_harness|winogrande|5_2024-01-23T12-11-33.435491.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T12-11-33.435491.parquet'
- config_name: results
data_files:
- split: 2024_01_23T12_06_15.262886
path:
- results_2024-01-23T12-06-15.262886.parquet
- split: 2024_01_23T12_11_33.435491
path:
- results_2024-01-23T12-11-33.435491.parquet
- split: latest
path:
- results_2024-01-23T12-11-33.435491.parquet
---
# Dataset Card for Evaluation run of xformAI/opt-125m-gqa-ub-6-best-for-KV-cache
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xformAI/opt-125m-gqa-ub-6-best-for-KV-cache](https://huggingface.co/xformAI/opt-125m-gqa-ub-6-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xformAI__opt-125m-gqa-ub-6-best-for-KV-cache",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T12:11:33.435491](https://huggingface.co/datasets/open-llm-leaderboard/details_xformAI__opt-125m-gqa-ub-6-best-for-KV-cache/blob/main/results_2024-01-23T12-11-33.435491.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.23214395574495633,
"acc_stderr": 0.029929161673252165,
"acc_norm": 0.23167592940331871,
"acc_norm_stderr": 0.030715935929569317,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.4953131184469278,
"mc2_stderr": 0.016004347037417377
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448069,
"acc_norm": 0.24232081911262798,
"acc_norm_stderr": 0.012521593295800118
},
"harness|hellaswag|10": {
"acc": 0.2590121489743079,
"acc_stderr": 0.004371969542814558,
"acc_norm": 0.24995020912168892,
"acc_norm_stderr": 0.004320990543283153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.4953131184469278,
"mc2_stderr": 0.016004347037417377
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612974
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
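The aggregate block above can also be inspected programmatically. The following is a minimal, self-contained sketch (the inlined JSON is copied from the "all" block of the results above rather than fetched from the Hub, so it runs without network access):

```python
import json

# Parse the "all" aggregate block copied verbatim from the latest results above.
aggregate = json.loads("""
{
  "all": {
    "acc": 0.23214395574495633,
    "acc_stderr": 0.029929161673252165,
    "acc_norm": 0.23167592940331871,
    "acc_norm_stderr": 0.030715935929569317,
    "mc1": 0.23255813953488372,
    "mc1_stderr": 0.014789157531080515,
    "mc2": 0.4953131184469278,
    "mc2_stderr": 0.016004347037417377
  }
}
""")

# Pull out the headline accuracy metric.
acc = aggregate["all"]["acc"]
print(f"average accuracy: {acc:.4f}")  # average accuracy: 0.2321
```

To work with the full per-task results instead, load the "results" config via `load_dataset(..., "results", split="latest")` as shown in the loading example above.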
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ovior/twitter_dataset_1713048428 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2337246
num_examples: 7218
download_size: 1316553
dataset_size: 2337246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alexthomas4/highsub-detection | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: image_id
dtype: int64
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: area
sequence: int64
- name: bbox
sequence:
sequence: int64
- name: category
sequence: string
- name: category_id
sequence: int64
- name: id
sequence: string
splits:
- name: train
num_bytes: 1010292103.0
num_examples: 695
download_size: 1010305730
dataset_size: 1010292103.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_81_1713043111 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2476864
num_examples: 6190
download_size: 1288723
dataset_size: 2476864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Berzerker/incidental_scene_ocr_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: output_json_dumpsed
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: data/*.parquet
language:
- en
---
|
llm-aes/pandalm-gemini-annotated | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: annotator
dtype: string
- name: preference
dtype: int64
- name: price_per_example
dtype: float64
- name: time_per_example
dtype: float64
- name: raw_completion
dtype: string
splits:
- name: train
num_bytes: 2723366
num_examples: 3600
download_size: 530841
dataset_size: 2723366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AA051611__V0202 | ---
pretty_name: Evaluation run of AA051611/V0202
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/V0202](https://huggingface.co/AA051611/V0202) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__V0202\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T21:33:44.363250](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0202/blob/main/results_2024-02-03T21-33-44.363250.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8475886247356212,\n\
\ \"acc_stderr\": 0.023609522686145943,\n \"acc_norm\": 0.8592262318029122,\n\
\ \"acc_norm_stderr\": 0.023958294301700357,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5088923290302036,\n\
\ \"mc2_stderr\": 0.015447986277853607\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.0140702655192688,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441375\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n\
\ \"acc_stderr\": 0.004832934529120793,\n \"acc_norm\": 0.8275243975303724,\n\
\ \"acc_norm_stderr\": 0.003770211859118937\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8074074074074075,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.8074074074074075,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9407894736842105,\n \"acc_stderr\": 0.01920689719680031,\n\
\ \"acc_norm\": 0.9407894736842105,\n \"acc_norm_stderr\": 0.01920689719680031\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n\
\ \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.86,\n \
\ \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.879245283018868,\n \"acc_stderr\": 0.020054189400972373,\n\
\ \"acc_norm\": 0.879245283018868,\n \"acc_norm_stderr\": 0.020054189400972373\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9583333333333334,\n\
\ \"acc_stderr\": 0.01671031580295999,\n \"acc_norm\": 0.9583333333333334,\n\
\ \"acc_norm_stderr\": 0.01671031580295999\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.838150289017341,\n\
\ \"acc_stderr\": 0.028083594279575755,\n \"acc_norm\": 0.838150289017341,\n\
\ \"acc_norm_stderr\": 0.028083594279575755\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n\
\ \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8893617021276595,\n \"acc_stderr\": 0.020506145099008426,\n\
\ \"acc_norm\": 0.8893617021276595,\n \"acc_norm_stderr\": 0.020506145099008426\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7982456140350878,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.7982456140350878,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8413793103448276,\n \"acc_stderr\": 0.030443500317583975,\n\
\ \"acc_norm\": 0.8413793103448276,\n \"acc_norm_stderr\": 0.030443500317583975\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.8518518518518519,\n \"acc_stderr\": 0.018296139984289767,\n \"\
acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.018296139984289767\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9548387096774194,\n \"acc_stderr\": 0.01181323762156236,\n \"\
acc_norm\": 0.9548387096774194,\n \"acc_norm_stderr\": 0.01181323762156236\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.7881773399014779,\n \"acc_stderr\": 0.028748983689941072,\n \"\
acc_norm\": 0.7881773399014779,\n \"acc_norm_stderr\": 0.028748983689941072\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \"acc_norm\"\
: 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9333333333333333,\n \"acc_stderr\": 0.019478290326359282,\n\
\ \"acc_norm\": 0.9333333333333333,\n \"acc_norm_stderr\": 0.019478290326359282\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9646464646464646,\n \"acc_stderr\": 0.013157318878046073,\n \"\
acc_norm\": 0.9646464646464646,\n \"acc_norm_stderr\": 0.013157318878046073\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909013,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909013\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.014491348171728305,\n\
\ \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.014491348171728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.7666666666666667,\n \"acc_stderr\": 0.025787874220959302,\n \
\ \"acc_norm\": 0.7666666666666667,\n \"acc_norm_stderr\": 0.025787874220959302\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8949579831932774,\n \"acc_stderr\": 0.019916300758805225,\n\
\ \"acc_norm\": 0.8949579831932774,\n \"acc_norm_stderr\": 0.019916300758805225\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6887417218543046,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.6887417218543046,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9467889908256881,\n \"acc_stderr\": 0.009623385815462397,\n \"\
acc_norm\": 0.9467889908256881,\n \"acc_norm_stderr\": 0.009623385815462397\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.8148148148148148,\n \"acc_stderr\": 0.026491914727355164,\n \"\
acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.026491914727355164\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9558823529411765,\n \"acc_stderr\": 0.014413198705704825,\n \"\
acc_norm\": 0.9558823529411765,\n \"acc_norm_stderr\": 0.014413198705704825\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9409282700421941,\n \"acc_stderr\": 0.015346597463888693,\n \
\ \"acc_norm\": 0.9409282700421941,\n \"acc_norm_stderr\": 0.015346597463888693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.9147982062780269,\n\
\ \"acc_stderr\": 0.01873745202573731,\n \"acc_norm\": 0.9147982062780269,\n\
\ \"acc_norm_stderr\": 0.01873745202573731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9236641221374046,\n \"acc_stderr\": 0.02328893953617375,\n\
\ \"acc_norm\": 0.9236641221374046,\n \"acc_norm_stderr\": 0.02328893953617375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9338842975206612,\n \"acc_stderr\": 0.02268340369172331,\n \"\
acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.02268340369172331\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9629629629629629,\n\
\ \"acc_stderr\": 0.018257067489429676,\n \"acc_norm\": 0.9629629629629629,\n\
\ \"acc_norm_stderr\": 0.018257067489429676\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9141104294478528,\n \"acc_stderr\": 0.022014662933817535,\n\
\ \"acc_norm\": 0.9141104294478528,\n \"acc_norm_stderr\": 0.022014662933817535\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.8482142857142857,\n\
\ \"acc_stderr\": 0.03405702838185695,\n \"acc_norm\": 0.8482142857142857,\n\
\ \"acc_norm_stderr\": 0.03405702838185695\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.02650144078476276,\n\
\ \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.02650144078476276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9572649572649573,\n\
\ \"acc_stderr\": 0.013250436685245011,\n \"acc_norm\": 0.9572649572649573,\n\
\ \"acc_norm_stderr\": 0.013250436685245011\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9527458492975734,\n\
\ \"acc_stderr\": 0.007587612392626577,\n \"acc_norm\": 0.9527458492975734,\n\
\ \"acc_norm_stderr\": 0.007587612392626577\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8554913294797688,\n \"acc_stderr\": 0.018929764513468728,\n\
\ \"acc_norm\": 0.8554913294797688,\n \"acc_norm_stderr\": 0.018929764513468728\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8804469273743016,\n\
\ \"acc_stderr\": 0.010850836082151255,\n \"acc_norm\": 0.8804469273743016,\n\
\ \"acc_norm_stderr\": 0.010850836082151255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.9052287581699346,\n \"acc_stderr\": 0.01677133127183646,\n\
\ \"acc_norm\": 0.9052287581699346,\n \"acc_norm_stderr\": 0.01677133127183646\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8938906752411575,\n\
\ \"acc_stderr\": 0.017491946161301987,\n \"acc_norm\": 0.8938906752411575,\n\
\ \"acc_norm_stderr\": 0.017491946161301987\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.9104938271604939,\n \"acc_stderr\": 0.01588414107393756,\n\
\ \"acc_norm\": 0.9104938271604939,\n \"acc_norm_stderr\": 0.01588414107393756\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.75177304964539,\n \"acc_stderr\": 0.0257700156442904,\n \"\
acc_norm\": 0.75177304964539,\n \"acc_norm_stderr\": 0.0257700156442904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8292046936114733,\n\
\ \"acc_stderr\": 0.009611645934807811,\n \"acc_norm\": 0.8292046936114733,\n\
\ \"acc_norm_stderr\": 0.009611645934807811\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01722970778103902,\n\
\ \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01722970778103902\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8872549019607843,\n \"acc_stderr\": 0.012795357747288056,\n \
\ \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.012795357747288056\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8454545454545455,\n\
\ \"acc_stderr\": 0.03462262571262667,\n \"acc_norm\": 0.8454545454545455,\n\
\ \"acc_norm_stderr\": 0.03462262571262667\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8775510204081632,\n \"acc_stderr\": 0.020985477705882164,\n\
\ \"acc_norm\": 0.8775510204081632,\n \"acc_norm_stderr\": 0.020985477705882164\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.945273631840796,\n\
\ \"acc_stderr\": 0.016082815796263267,\n \"acc_norm\": 0.945273631840796,\n\
\ \"acc_norm_stderr\": 0.016082815796263267\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6867469879518072,\n\
\ \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.6867469879518072,\n\
\ \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9181286549707602,\n \"acc_stderr\": 0.02102777265656387,\n\
\ \"acc_norm\": 0.9181286549707602,\n \"acc_norm_stderr\": 0.02102777265656387\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5088923290302036,\n\
\ \"mc2_stderr\": 0.015447986277853607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4586808188021228,\n \
\ \"acc_stderr\": 0.0137253773263428\n }\n}\n```"
repo_url: https://huggingface.co/AA051611/V0202
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-33-44.363250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- '**/details_harness|winogrande|5_2024-02-03T21-33-44.363250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T21-33-44.363250.parquet'
- config_name: results
data_files:
- split: 2024_02_03T21_33_44.363250
path:
- results_2024-02-03T21-33-44.363250.parquet
- split: latest
path:
- results_2024-02-03T21-33-44.363250.parquet
---
# Dataset Card for Evaluation run of AA051611/V0202
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/V0202](https://huggingface.co/AA051611/V0202) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__V0202",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T21:33:44.363250](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0202/blob/main/results_2024-02-03T21-33-44.363250.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.8475886247356212,
"acc_stderr": 0.023609522686145943,
"acc_norm": 0.8592262318029122,
"acc_norm_stderr": 0.023958294301700357,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5088923290302036,
"mc2_stderr": 0.015447986277853607
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.0140702655192688,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441375
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.004832934529120793,
"acc_norm": 0.8275243975303724,
"acc_norm_stderr": 0.003770211859118937
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8074074074074075,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.8074074074074075,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9407894736842105,
"acc_stderr": 0.01920689719680031,
"acc_norm": 0.9407894736842105,
"acc_norm_stderr": 0.01920689719680031
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.879245283018868,
"acc_stderr": 0.020054189400972373,
"acc_norm": 0.879245283018868,
"acc_norm_stderr": 0.020054189400972373
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9583333333333334,
"acc_stderr": 0.01671031580295999,
"acc_norm": 0.9583333333333334,
"acc_norm_stderr": 0.01671031580295999
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.028083594279575755,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.028083594279575755
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8893617021276595,
"acc_stderr": 0.020506145099008426,
"acc_norm": 0.8893617021276595,
"acc_norm_stderr": 0.020506145099008426
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7982456140350878,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.7982456140350878,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8413793103448276,
"acc_stderr": 0.030443500317583975,
"acc_norm": 0.8413793103448276,
"acc_norm_stderr": 0.030443500317583975
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.018296139984289767,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.018296139984289767
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9548387096774194,
"acc_stderr": 0.01181323762156236,
"acc_norm": 0.9548387096774194,
"acc_norm_stderr": 0.01181323762156236
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7881773399014779,
"acc_stderr": 0.028748983689941072,
"acc_norm": 0.7881773399014779,
"acc_norm_stderr": 0.028748983689941072
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9333333333333333,
"acc_stderr": 0.019478290326359282,
"acc_norm": 0.9333333333333333,
"acc_norm_stderr": 0.019478290326359282
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9646464646464646,
"acc_stderr": 0.013157318878046073,
"acc_norm": 0.9646464646464646,
"acc_norm_stderr": 0.013157318878046073
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909013,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909013
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.014491348171728305,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.014491348171728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.7666666666666667,
"acc_stderr": 0.025787874220959302,
"acc_norm": 0.7666666666666667,
"acc_norm_stderr": 0.025787874220959302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8949579831932774,
"acc_stderr": 0.019916300758805225,
"acc_norm": 0.8949579831932774,
"acc_norm_stderr": 0.019916300758805225
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6887417218543046,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.6887417218543046,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9467889908256881,
"acc_stderr": 0.009623385815462397,
"acc_norm": 0.9467889908256881,
"acc_norm_stderr": 0.009623385815462397
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.026491914727355164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.026491914727355164
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9558823529411765,
"acc_stderr": 0.014413198705704825,
"acc_norm": 0.9558823529411765,
"acc_norm_stderr": 0.014413198705704825
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9409282700421941,
"acc_stderr": 0.015346597463888693,
"acc_norm": 0.9409282700421941,
"acc_norm_stderr": 0.015346597463888693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.9147982062780269,
"acc_stderr": 0.01873745202573731,
"acc_norm": 0.9147982062780269,
"acc_norm_stderr": 0.01873745202573731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9236641221374046,
"acc_stderr": 0.02328893953617375,
"acc_norm": 0.9236641221374046,
"acc_norm_stderr": 0.02328893953617375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.02268340369172331,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.02268340369172331
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9629629629629629,
"acc_stderr": 0.018257067489429676,
"acc_norm": 0.9629629629629629,
"acc_norm_stderr": 0.018257067489429676
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9141104294478528,
"acc_stderr": 0.022014662933817535,
"acc_norm": 0.9141104294478528,
"acc_norm_stderr": 0.022014662933817535
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.8482142857142857,
"acc_stderr": 0.03405702838185695,
"acc_norm": 0.8482142857142857,
"acc_norm_stderr": 0.03405702838185695
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.02650144078476276,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.02650144078476276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9572649572649573,
"acc_stderr": 0.013250436685245011,
"acc_norm": 0.9572649572649573,
"acc_norm_stderr": 0.013250436685245011
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9527458492975734,
"acc_stderr": 0.007587612392626577,
"acc_norm": 0.9527458492975734,
"acc_norm_stderr": 0.007587612392626577
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8554913294797688,
"acc_stderr": 0.018929764513468728,
"acc_norm": 0.8554913294797688,
"acc_norm_stderr": 0.018929764513468728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8804469273743016,
"acc_stderr": 0.010850836082151255,
"acc_norm": 0.8804469273743016,
"acc_norm_stderr": 0.010850836082151255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9052287581699346,
"acc_stderr": 0.01677133127183646,
"acc_norm": 0.9052287581699346,
"acc_norm_stderr": 0.01677133127183646
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8938906752411575,
"acc_stderr": 0.017491946161301987,
"acc_norm": 0.8938906752411575,
"acc_norm_stderr": 0.017491946161301987
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9104938271604939,
"acc_stderr": 0.01588414107393756,
"acc_norm": 0.9104938271604939,
"acc_norm_stderr": 0.01588414107393756
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.75177304964539,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.75177304964539,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8292046936114733,
"acc_stderr": 0.009611645934807811,
"acc_norm": 0.8292046936114733,
"acc_norm_stderr": 0.009611645934807811
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01722970778103902,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01722970778103902
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.012795357747288056,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.012795357747288056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8454545454545455,
"acc_stderr": 0.03462262571262667,
"acc_norm": 0.8454545454545455,
"acc_norm_stderr": 0.03462262571262667
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8775510204081632,
"acc_stderr": 0.020985477705882164,
"acc_norm": 0.8775510204081632,
"acc_norm_stderr": 0.020985477705882164
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.945273631840796,
"acc_stderr": 0.016082815796263267,
"acc_norm": 0.945273631840796,
"acc_norm_stderr": 0.016082815796263267
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6867469879518072,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.6867469879518072,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9181286549707602,
"acc_stderr": 0.02102777265656387,
"acc_norm": 0.9181286549707602,
"acc_norm_stderr": 0.02102777265656387
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5088923290302036,
"mc2_stderr": 0.015447986277853607
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.4586808188021228,
"acc_stderr": 0.0137253773263428
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jbilcke-hf/ai-tube-latentmusik | ---
license: cc-by-nc-sa-4.0
pretty_name: Latentmusik
---
## Description
The neverending music video channel
## Model
SVD
## LoRA
jbilcke-hf/sdxl-cinematic-2
## Voice
Cloée
## Prompt
A video channel which produces dance music videos all day long! |
jlbaker361/league_faces_captioned_priors | ---
dataset_info:
features:
- name: splash
dtype: image
- name: tile
dtype: image
- name: label
dtype: string
- name: caption
dtype: string
- name: PRIOR_0
dtype: image
- name: PRIOR_1
dtype: image
- name: PRIOR_2
dtype: image
- name: PRIOR_3
dtype: image
- name: PRIOR_4
dtype: image
splits:
- name: train
num_bytes: 838110962.0
num_examples: 378
download_size: 837523838
dataset_size: 838110962.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bphanzhu/NewDatasetName | ---
license: mit
---
|
camel-ai/ai_society_translated | ---
license: cc-by-nc-4.0
language:
- ar
- zh
- ko
- ja
- hi
- ru
- es
- fr
- de
- it
tags:
- instruction-finetuning
pretty_name: CAMEL AI Society Translated
task_categories:
- text-generation
arxiv: 2303.17760
extra_gated_prompt: "By using this data, you acknowledge and agree to utilize it solely for research purposes, recognizing that the dataset may contain inaccuracies due to its artificial generation through ChatGPT."
extra_gated_fields:
Name: text
Email: text
I will adhere to the terms and conditions of this dataset: checkbox
---
# **CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society**
- **Github:** https://github.com/lightaime/camel
- **Website:** https://www.camel-ai.org/
- **Arxiv Paper:** https://arxiv.org/abs/2303.17760
## Dataset Summary
The original AI Society dataset is in English and is composed of 25K conversations between two gpt-3.5-turbo agents. The dataset is obtained by running role-playing for a combination of 50 user roles and 50 assistant roles with each combination running over 10 tasks.
We provide translated versions of the original English dataset into ten languages: Arabic, Chinese, Korean, Japanese, Hindi, Russian, Spanish, French, German, and Italian in ".zip" format.
The dataset was translated by prompting gpt-3.5-turbo to translate the presented sentences into each target language.
**Note:** gpt-3.5-turbo sometimes leaves particular keywords such as "Instruction", "Input", and "Solution" untranslated. Therefore, cleaning might be needed depending on your use case.
## Data Fields
**The data fields for chat format (`ai_society_chat_{language}.zip`) are as follows:**
* `input`: {assistant\_role\_index}\_{user\_role\_index}\_{task\_index}, for example 001_002_003 refers to assistant role 1, user role 2, and task 3 from our assistant role name, user role name, and task text files.
* `role_1`: assistant role
* `role_2`: user role
* `original_task`: the general assigned task for the assistant and user to cooperate on.
* `specified_task`: the task after task specifier, this task is more specific than the original task.
* `message_k`: refers to the k<sup>_th_</sup> message of the conversation.
* `role_type`: refers to whether the agent is an assistant or a user.
* `role_name`: refers to the assigned assistant/user role.
* `role`: refers to the role of the agent during the message for openai api. [usually not needed]
* `content`: refers to the content of the message.
* `termination_reason`: refers to the reason of termination of the chat.
* `num_messages`: refers to the total number of messages in the chat.
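Because the messages are stored under numbered `message_k` keys, a chat can be walked in order until the first missing index. A minimal sketch (the record below is a toy example mirroring the documented fields, not real dataset content):

```python
def iter_messages(chat: dict):
    """Yield messages in order, following the message_k field naming."""
    k = 1
    while f"message_{k}" in chat:
        yield chat[f"message_{k}"]
        k += 1

# toy record mirroring the documented fields (values are illustrative only)
chat = {
    "role_1": "Musician",
    "role_2": "Student",
    "num_messages": 2,
    "message_1": {"role_type": "user", "role_name": "Student",
                  "role": "user", "content": "Instruction: play a C major scale."},
    "message_2": {"role_type": "assistant", "role_name": "Musician",
                  "role": "assistant", "content": "Solution: C D E F G A B C."},
}

for msg in iter_messages(chat):
    print(msg["role_type"], "->", msg["content"])
```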
**Download in python**
```python
from huggingface_hub import hf_hub_download

# replace {language} by one of the following: ar, zh, ko, ja, hi, ru, es, fr, de, it
hf_hub_download(
    repo_id="camel-ai/ai_society_translated",
    repo_type="dataset",
    filename="ai_society_chat_{language}.zip",
    local_dir="datasets/",
    local_dir_use_symlinks=False,
)
```
### Citation
```
@misc{li2023camel,
title={CAMEL: Communicative Agents for "Mind" Exploration of Large Scale Language Model Society},
author={Guohao Li and Hasan Abed Al Kader Hammoud and Hani Itani and Dmitrii Khizbullin and Bernard Ghanem},
year={2023},
eprint={2303.17760},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Disclaimer:
This data was synthetically generated by gpt-3.5-turbo and might contain incorrect information. The dataset is intended for research purposes only.
|
Astonzzh/strategy_pred_v5_one_sentence_balanced | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int64
- name: dialog
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1385816
num_examples: 12580
- name: val
num_bytes: 157428.49337522197
num_examples: 1464
- name: test
num_bytes: 157536.02649911214
num_examples: 1465
download_size: 736785
dataset_size: 1700780.519874334
---
# Dataset Card for "strategy_pred_v5_one_sentence_balanced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Patt/RTE_TH | ---
task_categories:
- text-classification
language:
- en
- th
license: cc-by-sa-4.0
---
# Dataset Card for RTE_TH
### Dataset Description
This dataset is a Thai-translated version of [RTE](https://huggingface.co/datasets/super_glue/viewer/rte), produced with Google Translate, using the [Multilingual Universal Sentence Encoder](https://arxiv.org/abs/1907.04307) to score the Thai translations.
|
open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0 | ---
pretty_name: Evaluation run of BelalTab/finetuned-llama2-2048-v3.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BelalTab/finetuned-llama2-2048-v3.0](https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T02:20:30.010370](https://huggingface.co/datasets/open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0/blob/main/results_2024-01-21T02-20-30.010370.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46772409508435164,\n\
\ \"acc_stderr\": 0.03438619577805813,\n \"acc_norm\": 0.4725657390462538,\n\
\ \"acc_norm_stderr\": 0.03515979149976784,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4620705172193864,\n\
\ \"mc2_stderr\": 0.015609209255063306\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866982,\n\
\ \"acc_norm\": 0.49829351535836175,\n \"acc_norm_stderr\": 0.014611305705056992\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5805616411073491,\n\
\ \"acc_stderr\": 0.004924586362301655,\n \"acc_norm\": 0.7708623780123481,\n\
\ \"acc_norm_stderr\": 0.004194190406000104\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.03067609659938918,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.03067609659938918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.03148955829745529,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.03148955829745529\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.044629175353369355,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.044629175353369355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\
\ \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n\
\ \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.02048056884399899,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.02048056884399899\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5822784810126582,\n \"acc_stderr\": 0.032103530322412685,\n \
\ \"acc_norm\": 0.5822784810126582,\n \"acc_norm_stderr\": 0.032103530322412685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674054,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674054\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n\
\ \"acc_stderr\": 0.01709818470816191,\n \"acc_norm\": 0.6462324393358876,\n\
\ \"acc_norm_stderr\": 0.01709818470816191\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n\
\ \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n\
\ \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.028237769422085324,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.028237769422085324\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400473,\n\
\ \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400473\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053478,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053478\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3363754889178618,\n\
\ \"acc_stderr\": 0.01206708307945222,\n \"acc_norm\": 0.3363754889178618,\n\
\ \"acc_norm_stderr\": 0.01206708307945222\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4477124183006536,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123935,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.035282112582452306,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.035282112582452306\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4620705172193864,\n\
\ \"mc2_stderr\": 0.015609209255063306\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.009818090723727286\n }\n}\n```"
repo_url: https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|arc:challenge|25_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|gsm8k|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hellaswag|10_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T02-20-30.010370.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- '**/details_harness|winogrande|5_2024-01-21T02-20-30.010370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T02-20-30.010370.parquet'
- config_name: results
data_files:
- split: 2024_01_21T02_20_30.010370
path:
- results_2024-01-21T02-20-30.010370.parquet
- split: latest
path:
- results_2024-01-21T02-20-30.010370.parquet
---
# Dataset Card for Evaluation run of BelalTab/finetuned-llama2-2048-v3.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BelalTab/finetuned-llama2-2048-v3.0](https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T02:20:30.010370](https://huggingface.co/datasets/open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0/blob/main/results_2024-01-21T02-20-30.010370.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.46772409508435164,
"acc_stderr": 0.03438619577805813,
"acc_norm": 0.4725657390462538,
"acc_norm_stderr": 0.03515979149976784,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4620705172193864,
"mc2_stderr": 0.015609209255063306
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866982,
"acc_norm": 0.49829351535836175,
"acc_norm_stderr": 0.014611305705056992
},
"harness|hellaswag|10": {
"acc": 0.5805616411073491,
"acc_stderr": 0.004924586362301655,
"acc_norm": 0.7708623780123481,
"acc_norm_stderr": 0.004194190406000104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.03067609659938918,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.03067609659938918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.03148955829745529,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.03148955829745529
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.044629175353369355,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.044629175353369355
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.4,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.02048056884399899,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.02048056884399899
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5822784810126582,
"acc_stderr": 0.032103530322412685,
"acc_norm": 0.5822784810126582,
"acc_norm_stderr": 0.032103530322412685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674054,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674054
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6462324393358876,
"acc_stderr": 0.01709818470816191,
"acc_norm": 0.6462324393358876,
"acc_norm_stderr": 0.01709818470816191
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22793296089385476,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.22793296089385476,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085324,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085324
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.027607914087400473,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.027607914087400473
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.02788913930053478,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.02788913930053478
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3363754889178618,
"acc_stderr": 0.01206708307945222,
"acc_norm": 0.3363754889178618,
"acc_norm_stderr": 0.01206708307945222
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4477124183006536,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.4477124183006536,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123935,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123935
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.035282112582452306,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.035282112582452306
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.4620705172193864,
"mc2_stderr": 0.015609209255063306
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404686
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727286
}
}
```
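Each `harness|hendrycksTest-*|5` entry above is one MMLU subtask; the leaderboard's MMLU score is the mean of their accuracies. As a minimal sketch of that aggregation (the `mean_mmlu_acc` helper is ours, not part of the leaderboard tooling, and only a few entries from the JSON above are reproduced here):

```python
# A small excerpt of the results dict shown above. Values are copied verbatim
# from the JSON; non-MMLU tasks (e.g. ARC) must be excluded from the average.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.46710526315789475},
    "harness|arc:challenge|25": {"acc": 0.4684300341296928},  # not an MMLU subtask
}

def mean_mmlu_acc(results: dict) -> float:
    """Average 'acc' over the hendrycksTest (MMLU) subtasks only."""
    accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
    return sum(accs) / len(accs)

print(round(mean_mmlu_acc(results), 4))  # mean over the 3 MMLU entries above
```

Run over all 57 `hendrycksTest` entries, this reproduces the MMLU component of the aggregated score in the "all" block.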
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_eren23__DistiLabelOrca-TinyLLama-1.1B | ---
pretty_name: Evaluation run of eren23/DistiLabelOrca-TinyLLama-1.1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/DistiLabelOrca-TinyLLama-1.1B](https://huggingface.co/eren23/DistiLabelOrca-TinyLLama-1.1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__DistiLabelOrca-TinyLLama-1.1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T12:31:51.008876](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__DistiLabelOrca-TinyLLama-1.1B/blob/main/results_2024-01-27T12-31-51.008876.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2579497336465992,\n\
\ \"acc_stderr\": 0.03077796101189773,\n \"acc_norm\": 0.25889304976710464,\n\
\ \"acc_norm_stderr\": 0.031529056639141094,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104183,\n \"mc2\": 0.38054949560093154,\n\
\ \"mc2_stderr\": 0.014019298506911837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34897610921501704,\n \"acc_stderr\": 0.013928933461382494,\n\
\ \"acc_norm\": 0.36177474402730375,\n \"acc_norm_stderr\": 0.014041957945038073\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45937064329814775,\n\
\ \"acc_stderr\": 0.004973280417705513,\n \"acc_norm\": 0.6115315674168492,\n\
\ \"acc_norm_stderr\": 0.004864058877626288\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n\
\ \"acc_stderr\": 0.03247781185995593,\n \"acc_norm\": 0.17037037037037037,\n\
\ \"acc_norm_stderr\": 0.03247781185995593\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325635,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325635\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.03097543638684544,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.03097543638684544\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.021992016662370547,\n\
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.021992016662370547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809448,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809448\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n\
\ \"acc_stderr\": 0.01613917409652258,\n \"acc_norm\": 0.2848020434227331,\n\
\ \"acc_norm_stderr\": 0.01613917409652258\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23468057366362452,\n\
\ \"acc_stderr\": 0.010824026872449355,\n \"acc_norm\": 0.23468057366362452,\n\
\ \"acc_norm_stderr\": 0.010824026872449355\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677055,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.13877551020408163,\n \"acc_stderr\": 0.022131950419972655,\n\
\ \"acc_norm\": 0.13877551020408163,\n \"acc_norm_stderr\": 0.022131950419972655\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.03591566797824663,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.03591566797824663\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104183,\n \"mc2\": 0.38054949560093154,\n\
\ \"mc2_stderr\": 0.014019298506911837\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290856\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \
\ \"acc_stderr\": 0.0035275958887224655\n }\n}\n```"
repo_url: https://huggingface.co/eren23/DistiLabelOrca-TinyLLama-1.1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|arc:challenge|25_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|gsm8k|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hellaswag|10_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T12-31-51.008876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T12-31-51.008876.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- '**/details_harness|winogrande|5_2024-01-27T12-31-51.008876.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T12-31-51.008876.parquet'
- config_name: results
data_files:
- split: 2024_01_27T12_31_51.008876
path:
- results_2024-01-27T12-31-51.008876.parquet
- split: latest
path:
- results_2024-01-27T12-31-51.008876.parquet
---
# Dataset Card for Evaluation run of eren23/DistiLabelOrca-TinyLLama-1.1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/DistiLabelOrca-TinyLLama-1.1B](https://huggingface.co/eren23/DistiLabelOrca-TinyLLama-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
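The timestamp-to-split mapping can be sketched as follows (an illustrative reconstruction, not the actual leaderboard pipeline code): the run's ISO timestamp is turned into a valid split name by replacing `-` and `:` with `_`.

```python
# Illustrative sketch (not the actual pipeline code): a run timestamp such as
# "2024-01-27T12:31:51.008876" maps to the split name seen in this card's
# config list by replacing "-" and ":" with "_".
timestamp = "2024-01-27T12:31:51.008876"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_01_27T12_31_51.008876
```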
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__DistiLabelOrca-TinyLLama-1.1B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-27T12:31:51.008876](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__DistiLabelOrca-TinyLLama-1.1B/blob/main/results_2024-01-27T12-31-51.008876.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2579497336465992,
"acc_stderr": 0.03077796101189773,
"acc_norm": 0.25889304976710464,
"acc_norm_stderr": 0.031529056639141094,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104183,
"mc2": 0.38054949560093154,
"mc2_stderr": 0.014019298506911837
},
"harness|arc:challenge|25": {
"acc": 0.34897610921501704,
"acc_stderr": 0.013928933461382494,
"acc_norm": 0.36177474402730375,
"acc_norm_stderr": 0.014041957945038073
},
"harness|hellaswag|10": {
"acc": 0.45937064329814775,
"acc_stderr": 0.004973280417705513,
"acc_norm": 0.6115315674168492,
"acc_norm_stderr": 0.004864058877626288
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.03247781185995593,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.03247781185995593
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677077,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677077
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325635,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325635
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.03097543638684544,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.03097543638684544
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.021992016662370547,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.021992016662370547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809448,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809448
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.01613917409652258,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.01613917409652258
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855716,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23468057366362452,
"acc_stderr": 0.010824026872449355,
"acc_norm": 0.23468057366362452,
"acc_norm_stderr": 0.010824026872449355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677055,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.13877551020408163,
"acc_stderr": 0.022131950419972655,
"acc_norm": 0.13877551020408163,
"acc_norm_stderr": 0.022131950419972655
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.03591566797824663,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.03591566797824663
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104183,
"mc2": 0.38054949560093154,
"mc2_stderr": 0.014019298506911837
},
"harness|winogrande|5": {
"acc": 0.6085240726124704,
"acc_stderr": 0.013717487071290856
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224655
}
}
```
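As a quick sanity check, the per-task numbers above can be aggregated locally. The sketch below averages the "hendrycksTest" (MMLU) accuracies from a dict shaped like the JSON above; the two-task subset used here is copied from this card purely for illustration and is not the full results dict.

```python
# Sketch: average the per-task MMLU ("hendrycksTest") accuracies from a
# results dict shaped like the JSON above. The tasks and values below are
# a small illustrative subset taken from this card, not the full results.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.17037037037037037},
    "harness|winogrande|5": {"acc": 0.6085240726124704},
}

# Keep only MMLU tasks, identified by the "hendrycksTest" prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # 0.1952
```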
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/physics_dataset_standardized_cluster_4_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7508945
num_examples: 6876
download_size: 0
dataset_size: 7508945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_4_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Lajonbot__WizardLM-13B-V1.2-PL-lora_unload | ---
pretty_name: Evaluation run of Lajonbot/WizardLM-13B-V1.2-PL-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lajonbot/WizardLM-13B-V1.2-PL-lora_unload](https://huggingface.co/Lajonbot/WizardLM-13B-V1.2-PL-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__WizardLM-13B-V1.2-PL-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T09:37:24.771314](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__WizardLM-13B-V1.2-PL-lora_unload/blob/main/results_2023-10-16T09-37-24.771314.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003984899328859061,\n\
\ \"em_stderr\": 0.0006451805848102423,\n \"f1\": 0.06672923657718131,\n\
\ \"f1_stderr\": 0.0015525464124355034,\n \"acc\": 0.41089372554487175,\n\
\ \"acc_stderr\": 0.010708286080716344\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102423,\n\
\ \"f1\": 0.06672923657718131,\n \"f1_stderr\": 0.0015525464124355034\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11144806671721001,\n \
\ \"acc_stderr\": 0.008668021353794427\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638263\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Lajonbot/WizardLM-13B-V1.2-PL-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T09_37_24.771314
path:
- '**/details_harness|drop|3_2023-10-16T09-37-24.771314.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T09-37-24.771314.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T09_37_24.771314
path:
- '**/details_harness|gsm8k|5_2023-10-16T09-37-24.771314.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T09-37-24.771314.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T09_37_24.771314
path:
- '**/details_harness|winogrande|5_2023-10-16T09-37-24.771314.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T09-37-24.771314.parquet'
- config_name: results
data_files:
- split: 2023_10_16T09_37_24.771314
path:
- results_2023-10-16T09-37-24.771314.parquet
- split: latest
path:
- results_2023-10-16T09-37-24.771314.parquet
---
# Dataset Card for Evaluation run of Lajonbot/WizardLM-13B-V1.2-PL-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lajonbot/WizardLM-13B-V1.2-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lajonbot/WizardLM-13B-V1.2-PL-lora_unload](https://huggingface.co/Lajonbot/WizardLM-13B-V1.2-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__WizardLM-13B-V1.2-PL-lora_unload",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T09:37:24.771314](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__WizardLM-13B-V1.2-PL-lora_unload/blob/main/results_2023-10-16T09-37-24.771314.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102423,
"f1": 0.06672923657718131,
"f1_stderr": 0.0015525464124355034,
"acc": 0.41089372554487175,
"acc_stderr": 0.010708286080716344
},
"harness|drop|3": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102423,
"f1": 0.06672923657718131,
"f1_stderr": 0.0015525464124355034
},
"harness|gsm8k|5": {
"acc": 0.11144806671721001,
"acc_stderr": 0.008668021353794427
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638263
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AmanK1202/Pokemon_playground | ---
license: other
---
|
thangvip/cti-dataset | ---
dataset_info:
features:
- name: sentence_idx
dtype: int64
- name: words
sequence: string
- name: POS
sequence: int64
- name: tag
sequence: int64
splits:
- name: train
num_bytes: 13350196.989130436
num_examples: 13794
- name: test
num_bytes: 3338033.1604691073
num_examples: 3449
download_size: 2511496
dataset_size: 16688230.149599543
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
```python
# These dictionaries are useful for working with this dataset
pos_2_id = {'#': 0, '$': 1, "''": 2, '(': 3, ')': 4, '.': 5, ':': 6, 'CC': 7, 'CD': 8, 'DT': 9, 'EX': 10, 'FW': 11, 'IN': 12, 'JJ': 13, 'JJR': 14, 'JJS': 15, 'MD': 16, 'NN': 17, 'NNP': 18, 'NNPS': 19, 'NNS': 20, 'PDT': 21, 'POS': 22, 'PRP': 23, 'PRP$': 24, 'RB': 25, 'RBR': 26, 'RBS': 27, 'RP': 28, 'TO': 29, 'VB': 30, 'VBD': 31, 'VBG': 32, 'VBN': 33, 'VBP': 34, 'VBZ': 35, 'WDT': 36, 'WP': 37, 'WP$': 38, 'WRB': 39}
id_2_pos = {0: '#', 1: '$', 2: "''", 3: '(', 4: ')', 5: '.', 6: ':', 7: 'CC', 8: 'CD', 9: 'DT', 10: 'EX', 11: 'FW', 12: 'IN', 13: 'JJ', 14: 'JJR', 15: 'JJS', 16: 'MD', 17: 'NN', 18: 'NNP', 19: 'NNPS', 20: 'NNS', 21: 'PDT', 22: 'POS', 23: 'PRP', 24: 'PRP$', 25: 'RB', 26: 'RBR', 27: 'RBS', 28: 'RP', 29: 'TO', 30: 'VB', 31: 'VBD', 32: 'VBG', 33: 'VBN', 34: 'VBP', 35: 'VBZ', 36: 'WDT', 37: 'WP', 38: 'WP$', 39: 'WRB'}
tag_2_id = {'B-application': 0, 'B-cve id': 1, 'B-edition': 2, 'B-file': 3, 'B-function': 4, 'B-hardware': 5, 'B-language': 6, 'B-method': 7, 'B-os': 8, 'B-parameter': 9, 'B-programming language': 10, 'B-relevant_term': 11, 'B-update': 12, 'B-vendor': 13, 'B-version': 14, 'I-application': 15, 'I-edition': 16, 'I-hardware': 17, 'I-os': 18, 'I-relevant_term': 19, 'I-update': 20, 'I-vendor': 21, 'I-version': 22, 'O': 23}
id_2_tag = {0: 'B-application', 1: 'B-cve id', 2: 'B-edition', 3: 'B-file', 4: 'B-function', 5: 'B-hardware', 6: 'B-language', 7: 'B-method', 8: 'B-os', 9: 'B-parameter', 10: 'B-programming language', 11: 'B-relevant_term', 12: 'B-update', 13: 'B-vendor', 14: 'B-version', 15: 'I-application', 16: 'I-edition', 17: 'I-hardware', 18: 'I-os', 19: 'I-relevant_term', 20: 'I-update', 21: 'I-vendor', 22: 'I-version', 23: 'O'}
```
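For example, the integer-encoded `tag` sequences can be decoded back into their string labels with `id_2_tag`. A minimal sketch (the token sequence below is a hypothetical example, not a row from the dataset; the mapping is copied from above so the snippet is self-contained):

```python
# Decode integer-encoded NER tags back to their string labels using the
# id_2_tag mapping defined above (repeated here for self-containment).
id_2_tag = {0: 'B-application', 1: 'B-cve id', 2: 'B-edition', 3: 'B-file', 4: 'B-function', 5: 'B-hardware', 6: 'B-language', 7: 'B-method', 8: 'B-os', 9: 'B-parameter', 10: 'B-programming language', 11: 'B-relevant_term', 12: 'B-update', 13: 'B-vendor', 14: 'B-version', 15: 'I-application', 16: 'I-edition', 17: 'I-hardware', 18: 'I-os', 19: 'I-relevant_term', 20: 'I-update', 21: 'I-vendor', 22: 'I-version', 23: 'O'}

def decode_tags(tag_ids):
    """Map a sequence of integer tag ids to their string labels."""
    return [id_2_tag[i] for i in tag_ids]

# Hypothetical example row
words = ['Microsoft', 'Windows', 'is', 'affected']
tags = decode_tags([13, 8, 23, 23])
print(list(zip(words, tags)))
# → [('Microsoft', 'B-vendor'), ('Windows', 'B-os'), ('is', 'O'), ('affected', 'O')]
```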
|
yuvalkirstain/beautiful_interesting_spectacular_photo_anime_25000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: pclean
dtype: float64
splits:
- name: train
num_bytes: 773920358.0
num_examples: 956
download_size: 773924888
dataset_size: 773920358.0
---
# Dataset Card for "beautiful_interesting_spectacular_photo_anime_25000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mudassar93/piano_music | ---
dataset_info:
features:
- name: response
dtype: string
- name: instruction
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1063812
num_examples: 1823
download_size: 239640
dataset_size: 1063812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
belacan/bwolleh | ---
license: apache-2.0
license_name: ganzerfilm
license_link: LICENSE
---
|
gabrielmbmb/deitaset | ---
dataset_info:
features:
- name: id
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 18401019
num_examples: 50
download_size: 7796617
dataset_size: 18401019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_40 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1291703872.0
num_examples: 251696
download_size: 1316184385
dataset_size: 1291703872.0
---
# Dataset Card for "chunk_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
christykoh/boolq_pt | ---
dataset_info:
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: answer
dtype: bool
splits:
- name: train
num_bytes: 4550515
num_examples: 9427
- name: validation
num_bytes: 1578340
num_examples: 3270
download_size: 3842223
dataset_size: 6128855
---
# Dataset Card for "boolq_pt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Imadken/platypus_formatted | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 56821711.979539074
num_examples: 21857
- name: test
num_bytes: 6435205.953469715
num_examples: 2414
download_size: 30181422
dataset_size: 63256917.93300879
---
# Dataset Card for "platypus_formatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MtCelesteMa/multiglue | ---
license: cc-by-4.0
task_categories:
- text-classification
size_categories:
- 100K<n<1M
language:
- en
multilinguality:
- monolingual
pretty_name: MultiGLUE
source_datasets:
- extended|glue
language_creators:
- found
annotations_creators:
- found
---
# Dataset Card for MultiGLUE
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is a combination of the cola, mrpc, qnli, qqp, rte, sst2, and wnli subsets of the GLUE dataset. Its intended use is to benchmark language models on multitask binary classification.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Like the GLUE dataset, this dataset is in English.
## Dataset Structure
### Data Instances
An example instance looks like this:
```
{
"label": 1,
"task": "cola",
"sentence1": "The sailors rode the breeze clear of the rocks.",
"sentence2": null
}
```
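Since single-sentence tasks (e.g. cola, sst2) leave `sentence2` as `null`, code consuming the dataset should handle both shapes. A minimal sketch (the `[SEP]`-joining convention here is an assumption for illustration, not part of the dataset):

```python
def format_instance(instance):
    """Build a single model-input string from a MultiGLUE instance,
    handling both single-sentence and sentence-pair tasks."""
    if instance["sentence2"] is None:
        return instance["sentence1"]
    return instance["sentence1"] + " [SEP] " + instance["sentence2"]

# The example instance shown above
example = {
    "label": 1,
    "task": "cola",
    "sentence1": "The sailors rode the breeze clear of the rocks.",
    "sentence2": None,
}
print(format_instance(example))
# → The sailors rode the breeze clear of the rocks.
```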
### Data Fields
- `task`: A `string` feature, indicating the GLUE task the instance is from.
- `sentence1`: A `string` feature.
- `sentence2`: A `string` feature.
- `label`: A classification label, either 0 or 1.
### Data Splits
- `train`: 551,282 instances
- `validation`: 48,564 instances
- `test`: 404,183 instances, no classification label (same as GLUE)
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
This dataset is created by combining the cola, mrpc, qnli, qqp, rte, sst2, and wnli subsets of the GLUE dataset.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hebrew_this_world | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- he
license:
- agpl-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: HebrewSentiment
dataset_info:
features:
- name: issue_num
dtype: int64
- name: page_count
dtype: int64
- name: date
dtype: string
- name: date_he
dtype: string
- name: year
dtype: string
- name: href
dtype: string
- name: pdf
dtype: string
- name: coverpage
dtype: string
- name: backpage
dtype: string
- name: content
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 678389435
num_examples: 2028
download_size: 678322912
dataset_size: 678389435
---
# Dataset Card for HebrewSentiment
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://thisworld.online/
- **Repository:** https://github.com/thisworld1/thisworld.online
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
HebrewThisWorld is a dataset consisting of 2028 issues of the newspaper 'This World', edited by Uri Avnery and published between 1950 and 1989. Released under the AGPLv3 license.
### Supported Tasks and Leaderboards
Language modeling
### Languages
Hebrew
## Dataset Structure
CSV file with "," as the delimiter
### Data Instances
Sample:
```json
{
"issue_num": 637,
"page_count": 16,
"date": "1950-01-01",
"date_he": "1 בינואר 1950",
"year": "1950",
"href": "https://thisworld.online/1950/637",
"pdf": "https://olam.eu-central-1.linodeobjects.com/pdfs/B-I0637-D010150.pdf",
"coverpage": "https://olam.eu-central-1.linodeobjects.com/pages/637/t-1.png",
"backpage": "https://olam.eu-central-1.linodeobjects.com/pages/637/t-16.png",
"content": "\nלפיד\nהנוער ־ בירושלים צילומים :\n\nב. רותנברג\n\nוזהו הלפיד\n...",
"url": "https://thisworld.online/api/1950/637"
}
```
### Data Fields
- `issue_num`: ID/Number of the issue
- `page_count`: Page count of the current issue
- `date`: Published date
- `date_he`: Published date in Hebrew
- `year`: Year of the issue
- `href`: URL to the issue to scan/print etc.
- `pdf`: URL to the issue to scan in pdf
- `coverpage`: URL to coverpage
- `backpage`: URL to backpage
- `content`: text content of the issue
- `url`: URL
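Given a record shaped like the sample instance above, per-issue values can be read directly from these fields. A minimal sketch on a truncated record (the record literal is abbreviated from the sample, not loaded from the dataset):

```python
from datetime import date

# Truncated record shaped like the sample instance above
record = {
    "issue_num": 637,
    "page_count": 16,
    "date": "1950-01-01",
    "content": "\nלפיד\nהנוער ־ בירושלים צילומים :\n\nב. רותנברג\n\nוזהו הלפיד\n",
}

# The `date` field is ISO-formatted and parses directly
published = date.fromisoformat(record["date"])
n_chars = len(record["content"])  # content length varies per issue
print(record["issue_num"], published.year, record["page_count"])
# → 637 1950 16
```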
### Data Splits
| | train |
|--------|------:|
| corpus | 2028 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[thisworld.online](https://thisworld.online/)
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
Researchers
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
GNU AGPLv3+
This is free software, and you are welcome to redistribute it under certain conditions.
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
### Citation Information
https://thisworld.online/
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq), [@imvladikon](https://github.com/imvladikon) for adding this dataset. |
vishaal27/YFCC15M_page_and_download_urls | ---
license: mit
task_categories:
- zero-shot-classification
- image-to-text
language:
- en
pretty_name: yfcc15m
size_categories:
- 10M<n<100M
---
## YFCC15M subset used for VLMs
This dataset contains the ~15M subset of YFCC100M used for training the models in the paper [Quality Not Quantity: On the Interaction between Dataset Design and Robustness of CLIP](https://arxiv.org/abs/2208.05516). The metadata provided in this repo contains both the page URLs and the image download URLs for downloading the dataset.
This dataset can be easily downloaded with [img2dataset](https://github.com/rom1504/img2dataset):
```bash
img2dataset --url_list yfcc15m_final_split_pageandimageurls.csv --input_format "csv" --output_format webdataset --output_folder images --processes_count 2 --thread_count 8 --resize_mode no --enable_wandb True
``` |
doushabao4766/msra_ner_k_V3_wc_bioes | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': B-ORG
'3': B-LOC
'4': I-PER
'5': I-ORG
'6': I-LOC
'7': E-PER
'8': E-ORG
'9': E-LOC
'10': S-PER
'11': S-ORG
'12': S-LOC
- name: knowledge
dtype: string
- name: token_words
sequence:
sequence: string
- name: knowledge_words
sequence:
sequence: string
splits:
- name: train
num_bytes: 334987989
num_examples: 45000
- name: test
num_bytes: 25028455
num_examples: 3442
download_size: 73312900
dataset_size: 360016444
---
# Dataset Card for "msra_ner_k_V3_wc_bioes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train30_eval20_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8260
num_examples: 80
- name: validation
num_bytes: 2522
num_examples: 20
download_size: 8644
dataset_size: 10782
---
# Dataset Card for "random_letter_find_passage_train30_eval20_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_90 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24455933856.25
num_examples: 254622
download_size: 22487639626
dataset_size: 24455933856.25
---
# Dataset Card for "chunk_90"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MBassNoBeat/japaozinvoz | ---
license: openrail
---
|
nakcnx/Thai-UCC | ---
license:
- cc-by-nc-sa-4.0
---
Thai UCC Corpus is translated from [UCC (Unhealthy Comments Corpus)](https://github.com/conversationai/unhealthy-conversations) using the PyThaiNLP Translator and Google Translate. |
biglab/webui-70k-elements | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence:
sequence: string
- name: contentBoxes
sequence:
sequence: float64
- name: paddingBoxes
sequence:
sequence: float64
- name: borderBoxes
sequence:
sequence: float64
- name: marginBoxes
sequence:
sequence: float64
- name: key_name
dtype: string
splits:
- name: train
num_bytes: 12719410165.962
num_examples: 173546
download_size: 11396715289
dataset_size: 12719410165.962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information.
https://huggingface.co/datasets/biglab/webui-70k
```
from datasets import load_dataset
dataset = load_dataset("biglab/webui-70k-elements")
``` |
senseiberia/768_regularization_images | ---
license: gpl
---
|
CyberHarem/matara_okina_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matara_okina (Touhou)
This is the dataset of matara_okina (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, hat, long_hair, black_headwear, bangs, yellow_eyes, brown_headwear, hair_between_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 630.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 368.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1134 | 751.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 558.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1134 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/matara_okina_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matara_okina_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, detached_sleeves, green_skirt, long_sleeves, orange_cape, simple_background, solo, white_shirt, wide_sleeves, constellation_print, looking_at_viewer, smile, tabard, eyes_visible_through_hair, white_background, medium_breasts, hand_up, closed_mouth, hands_up, orange_sleeves, sun_symbol, blush, drum, sitting, standing, boots, open_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, detached_sleeves, long_sleeves, medium_breasts, orange_cape, orange_sleeves, simple_background, solo, tabard, upper_body, white_shirt, wide_sleeves, constellation_print, looking_at_viewer, smile, white_background, blush, hand_up, open_mouth |
| 2 | 8 |  |  |  |  |  | 1girl, constellation_print, detached_sleeves, green_skirt, long_sleeves, looking_at_viewer, solo, tabard, white_shirt, wide_sleeves, smile, open_mouth, orange_cape, orange_sleeves |
| 3 | 12 |  |  |  |  |  | 1girl, closed_mouth, green_skirt, long_sleeves, looking_at_viewer, sitting, smile, solo, tabard, wide_sleeves, constellation_print, detached_sleeves, white_shirt, chair, drum, orange_sleeves, boots, black_footwear |
| 4 | 7 |  |  |  |  |  | 1girl, boots, closed_mouth, detached_sleeves, full_body, green_skirt, long_sleeves, looking_at_viewer, solo, tabard, wide_sleeves, black_footwear, constellation_print, smile, aura, simple_background, standing, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | green_skirt | long_sleeves | orange_cape | simple_background | solo | white_shirt | wide_sleeves | constellation_print | looking_at_viewer | smile | tabard | eyes_visible_through_hair | white_background | medium_breasts | hand_up | closed_mouth | hands_up | orange_sleeves | sun_symbol | blush | drum | sitting | standing | boots | open_mouth | upper_body | chair | black_footwear | full_body | aura |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------------|:---------------|:--------------|:--------------------|:-------|:--------------|:---------------|:----------------------|:--------------------|:--------|:---------|:----------------------------|:-------------------|:-----------------|:----------|:---------------|:-----------|:-----------------|:-------------|:--------|:-------|:----------|:-----------|:--------|:-------------|:-------------|:--------|:-----------------|:------------|:-------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | X | | X | | | | | X | X | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | X | | | | | | | X | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | X | X | | | | | X | | X | | | X | X | | X | | | X | X | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | | X | | | X | | | | | | | X | X | | | | X | X | X |
|
yuvalkirstain/beautiful_interesting_spectacular_photo_portrait_Marilyn_Monroe_25000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: pclean
dtype: float64
splits:
- name: train
num_bytes: 120049326.0
num_examples: 228
download_size: 120049639
dataset_size: 120049326.0
---
# Dataset Card for "beautiful_interesting_spectacular_photo_portrait_Marilyn_Monroe_25000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chronbmm/sanskrit-stemming-sentences | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 72623052
num_examples: 614286
- name: validation
num_bytes: 4340386
num_examples: 38227
- name: test
num_bytes: 3794629
num_examples: 32045
- name: test_500
num_bytes: 53161
num_examples: 500
- name: validation_500
num_bytes: 64578
num_examples: 500
download_size: 38399
dataset_size: 80875806
---
# Dataset Card for "sanskrit-stemming-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wino_bias | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- coreference-resolution
paperswithcode_id: winobias
pretty_name: WinoBias
dataset_info:
- config_name: type1_anti
features:
- name: document_id
dtype: string
- name: part_number
dtype: string
- name: word_number
sequence: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
'47': HYPH
'48': XX
'49': NFP
'50': AFX
'51': ADD
'52': -LRB-
'53': -RRB-
'54': '-'
- name: parse_bit
sequence: string
- name: predicate_lemma
sequence: string
- name: predicate_framenet_id
sequence: string
- name: word_sense
sequence: string
- name: speaker
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': B-NORP
'3': I-NORP
'4': B-FAC
'5': I-FAC
'6': B-ORG
'7': I-ORG
'8': B-GPE
'9': I-GPE
'10': B-LOC
'11': I-LOC
'12': B-PRODUCT
'13': I-PRODUCT
'14': B-EVENT
'15': I-EVENT
'16': B-WORK_OF_ART
'17': I-WORK_OF_ART
'18': B-LAW
'19': I-LAW
'20': B-LANGUAGE
'21': I-LANGUAGE
'22': B-DATE
'23': I-DATE
'24': B-TIME
'25': I-TIME
'26': B-PERCENT
'27': I-PERCENT
'28': B-MONEY
'29': I-MONEY
'30': B-QUANTITY
'31': I-QUANTITY
'32': B-ORDINAL
'33': I-ORDINAL
'34': B-CARDINAL
'35': I-CARDINAL
'36': '*'
'37': '0'
'38': '-'
- name: verbal_predicates
sequence: string
- name: coreference_clusters
sequence: string
splits:
- name: validation
num_bytes: 380510
num_examples: 396
- name: test
num_bytes: 402893
num_examples: 396
download_size: 65383
dataset_size: 783403
- config_name: type1_pro
features:
- name: document_id
dtype: string
- name: part_number
dtype: string
- name: word_number
sequence: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
'47': HYPH
'48': XX
'49': NFP
'50': AFX
'51': ADD
'52': -LRB-
'53': -RRB-
'54': '-'
- name: parse_bit
sequence: string
- name: predicate_lemma
sequence: string
- name: predicate_framenet_id
sequence: string
- name: word_sense
sequence: string
- name: speaker
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': B-NORP
'3': I-NORP
'4': B-FAC
'5': I-FAC
'6': B-ORG
'7': I-ORG
'8': B-GPE
'9': I-GPE
'10': B-LOC
'11': I-LOC
'12': B-PRODUCT
'13': I-PRODUCT
'14': B-EVENT
'15': I-EVENT
'16': B-WORK_OF_ART
'17': I-WORK_OF_ART
'18': B-LAW
'19': I-LAW
'20': B-LANGUAGE
'21': I-LANGUAGE
'22': B-DATE
'23': I-DATE
'24': B-TIME
'25': I-TIME
'26': B-PERCENT
'27': I-PERCENT
'28': B-MONEY
'29': I-MONEY
'30': B-QUANTITY
'31': I-QUANTITY
'32': B-ORDINAL
'33': I-ORDINAL
'34': B-CARDINAL
'35': I-CARDINAL
'36': '*'
'37': '0'
'38': '-'
- name: verbal_predicates
sequence: string
- name: coreference_clusters
sequence: string
splits:
- name: validation
num_bytes: 379044
num_examples: 396
- name: test
num_bytes: 401705
num_examples: 396
download_size: 65516
dataset_size: 780749
- config_name: type2_anti
features:
- name: document_id
dtype: string
- name: part_number
dtype: string
- name: word_number
sequence: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
'47': HYPH
'48': XX
'49': NFP
'50': AFX
'51': ADD
'52': -LRB-
'53': -RRB-
'54': '-'
- name: parse_bit
sequence: string
- name: predicate_lemma
sequence: string
- name: predicate_framenet_id
sequence: string
- name: word_sense
sequence: string
- name: speaker
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': B-NORP
'3': I-NORP
'4': B-FAC
'5': I-FAC
'6': B-ORG
'7': I-ORG
'8': B-GPE
'9': I-GPE
'10': B-LOC
'11': I-LOC
'12': B-PRODUCT
'13': I-PRODUCT
'14': B-EVENT
'15': I-EVENT
'16': B-WORK_OF_ART
'17': I-WORK_OF_ART
'18': B-LAW
'19': I-LAW
'20': B-LANGUAGE
'21': I-LANGUAGE
'22': B-DATE
'23': I-DATE
'24': B-TIME
'25': I-TIME
'26': B-PERCENT
'27': I-PERCENT
'28': B-MONEY
'29': I-MONEY
'30': B-QUANTITY
'31': I-QUANTITY
'32': B-ORDINAL
'33': I-ORDINAL
'34': B-CARDINAL
'35': I-CARDINAL
'36': '*'
'37': '0'
'38': '-'
- name: verbal_predicates
sequence: string
- name: coreference_clusters
sequence: string
splits:
- name: validation
num_bytes: 368421
num_examples: 396
- name: test
num_bytes: 376926
num_examples: 396
download_size: 62555
dataset_size: 745347
- config_name: type2_pro
features:
- name: document_id
dtype: string
- name: part_number
dtype: string
- name: word_number
sequence: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
'47': HYPH
'48': XX
'49': NFP
'50': AFX
'51': ADD
'52': -LRB-
'53': -RRB-
'54': '-'
- name: parse_bit
sequence: string
- name: predicate_lemma
sequence: string
- name: predicate_framenet_id
sequence: string
- name: word_sense
sequence: string
- name: speaker
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': B-NORP
'3': I-NORP
'4': B-FAC
'5': I-FAC
'6': B-ORG
'7': I-ORG
'8': B-GPE
'9': I-GPE
'10': B-LOC
'11': I-LOC
'12': B-PRODUCT
'13': I-PRODUCT
'14': B-EVENT
'15': I-EVENT
'16': B-WORK_OF_ART
'17': I-WORK_OF_ART
'18': B-LAW
'19': I-LAW
'20': B-LANGUAGE
'21': I-LANGUAGE
'22': B-DATE
'23': I-DATE
'24': B-TIME
'25': I-TIME
'26': B-PERCENT
'27': I-PERCENT
'28': B-MONEY
'29': I-MONEY
'30': B-QUANTITY
'31': I-QUANTITY
'32': B-ORDINAL
'33': I-ORDINAL
'34': B-CARDINAL
'35': I-CARDINAL
'36': '*'
'37': '0'
'38': '-'
- name: verbal_predicates
sequence: string
- name: coreference_clusters
sequence: string
splits:
- name: validation
num_bytes: 366957
num_examples: 396
- name: test
num_bytes: 375144
num_examples: 396
download_size: 62483
dataset_size: 742101
- config_name: wino_bias
features:
- name: document_id
dtype: string
- name: part_number
dtype: string
- name: word_number
sequence: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
'47': HYPH
'48': XX
'49': NFP
'50': AFX
'51': ADD
'52': -LRB-
'53': -RRB-
- name: parse_bit
sequence: string
- name: predicate_lemma
sequence: string
- name: predicate_framenet_id
sequence: string
- name: word_sense
sequence: string
- name: speaker
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': B-NORP
'3': I-NORP
'4': B-FAC
'5': I-FAC
'6': B-ORG
'7': I-ORG
'8': B-GPE
'9': I-GPE
'10': B-LOC
'11': I-LOC
'12': B-PRODUCT
'13': I-PRODUCT
'14': B-EVENT
'15': I-EVENT
'16': B-WORK_OF_ART
'17': I-WORK_OF_ART
'18': B-LAW
'19': I-LAW
'20': B-LANGUAGE
'21': I-LANGUAGE
'22': B-DATE
'23': I-DATE
'24': B-TIME
'25': I-TIME
'26': B-PERCENT
'27': I-PERCENT
'28': B-MONEY
'29': I-MONEY
'30': B-QUANTITY
'31': I-QUANTITY
'32': B-ORDINAL
'33': I-ORDINAL
'34': B-CARDINAL
'35': I-CARDINAL
'36': '*'
'37': '0'
- name: verbal_predicates
sequence: string
splits:
- name: train
num_bytes: 173899234
num_examples: 150335
download_size: 268725744
dataset_size: 173899234
configs:
- config_name: type1_anti
data_files:
- split: validation
path: type1_anti/validation-*
- split: test
path: type1_anti/test-*
- config_name: type1_pro
data_files:
- split: validation
path: type1_pro/validation-*
- split: test
path: type1_pro/test-*
- config_name: type2_anti
data_files:
- split: validation
path: type2_anti/validation-*
- split: test
path: type2_anti/test-*
- config_name: type2_pro
data_files:
- split: validation
path: type2_pro/validation-*
- split: test
path: type2_pro/test-*
---
# Dataset Card for Wino_Bias dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [WinoBias](https://uclanlp.github.io/corefBias/overview)
- **Repository:**
- **Paper:** [Arxiv](https://arxiv.org/abs/1804.06876)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
WinoBias is a Winograd-schema-style dataset for coreference resolution focused on gender bias.
The corpus contains Winograd-schema-style sentences with entities corresponding to people referred to by their occupation (e.g. the nurse, the doctor, the carpenter).
### Supported Tasks and Leaderboards
The underlying task is coreference resolution.
### Languages
English
## Dataset Structure
### Data Instances
The dataset has 4 subsets: `type1_pro`, `type1_anti`, `type2_pro` and `type2_anti`.
The `*_pro` subsets contain sentences that reinforce gender stereotypes (e.g. mechanics are male, nurses are female), whereas the `*_anti` datasets contain "anti-stereotypical" sentences (e.g. mechanics are female, nurses are male).
The `type1` (*WB-Knowledge*) subsets contain sentences for which world knowledge is necessary to resolve the co-references, and `type2` (*WB-Syntax*) subsets require only the syntactic information present in the sentence to resolve them.
### Data Fields
- document_id = This is a variation on the document filename
- part_number = Some files are divided into multiple parts numbered as 000, 001, 002, ... etc.
- word_number = This is the index of the word in its sentence.
- tokens = This is the token as segmented/tokenized in the Treebank.
- pos_tags = This is the Penn Treebank-style part of speech. When parse information is missing, all parts of speech except the one for which there is some sense or proposition annotation are marked with an XX tag. The verb is marked with just a VERB tag.
- parse_bit = This is the bracketed structure broken before the first open parenthesis in the parse, with the word/part-of-speech leaf replaced with a *. The full parse can be created by substituting the asterisk with the "([pos] [word])" string (or leaf) and concatenating the items in the rows of that column. When the parse information is missing, the first word of a sentence is tagged as "(TOP*", the last word is tagged as "*)", and all intermediate words are tagged with a "*".
- predicate_lemma = The predicate lemma is mentioned for the rows for which we have semantic role information or word sense information. All other rows are marked with a "-".
- predicate_framenet_id = This is the PropBank frameset ID of the predicate in predicate_lemma.
- word_sense = This is the word sense of the word in Column tokens.
- speaker = This is the speaker or author name where available.
- ner_tags = This column identifies the spans representing various named entities. For documents which do not have named entity annotation, each line is marked with an "*".
- verbal_predicates = There is one column of predicate-argument structure information for each predicate mentioned in predicate_lemma. If no predicates are tagged in a sentence, this is a single column with all rows marked with an "*".
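A minimal sketch of the parse reconstruction described for `parse_bit` (the helper function and the toy sentence below are illustrative assumptions, not rows drawn from the corpus):

```python
def reconstruct_parse(parse_bits, pos_tags, tokens):
    """Rebuild the full bracketed parse: substitute each '*' leaf with
    '(POS token)' and concatenate the per-word parse bits in order."""
    return "".join(
        bit.replace("*", f"({pos} {tok})", 1)
        for bit, pos, tok in zip(parse_bits, pos_tags, tokens)
    )

# Hypothetical column values for the sentence "The nurse smiled ."
bits = ["(TOP(S(NP*", "*)", "(VP*)", "*))"]
pos = ["DT", "NN", "VBD", "."]
toks = ["The", "nurse", "smiled", "."]
print(reconstruct_parse(bits, pos, toks))
# (TOP(S(NP(DT The)(NN nurse))(VP(VBD smiled))(. .)))
```

Each `*` stands for exactly one leaf, so zipping the three per-word columns and substituting is enough to recover the bracketed tree.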
### Data Splits
Dev and test splits are available.
## Dataset Creation
### Curation Rationale
The WinoBias dataset was introduced in 2018 (see [paper](https://arxiv.org/abs/1804.06876)), with its original task being *coreference resolution*, which is a task that aims to identify mentions that refer to the same entity or person.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
The dataset was created by researchers familiar with the WinoBias project, based on two prototypical templates provided by the authors, in which entities interact in plausible ways.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
"Researchers familiar with the [WinoBias] project"
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[Recent work](https://www.microsoft.com/en-us/research/uploads/prod/2021/06/The_Salmon_paper.pdf) has shown that this dataset contains grammatical issues, incorrect or ambiguous labels, and stereotype conflation, among other limitations.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Jieyu Zhao, Tianlu Wang, Mark Yatskar, Vicente Ordonez and Kai-Wei Chang
### Licensing Information
MIT License
### Citation Information
@article{DBLP:journals/corr/abs-1804-06876,
author = {Jieyu Zhao and
Tianlu Wang and
Mark Yatskar and
Vicente Ordonez and
Kai{-}Wei Chang},
title = {Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods},
journal = {CoRR},
volume = {abs/1804.06876},
year = {2018},
url = {http://arxiv.org/abs/1804.06876},
archivePrefix = {arXiv},
eprint = {1804.06876},
timestamp = {Mon, 13 Aug 2018 16:47:01 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1804-06876.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
### Contributions
Thanks to [@akshayb7](https://github.com/akshayb7) for adding this dataset. Updated by [@JieyuZhao](https://github.com/JieyuZhao). |
autoevaluate/autoeval-eval-jeffdshen__inverse_superglue_mixedp1-jeffdshen__inverse-63643c-1665558891 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/inverse_superglue_mixedp1
eval_info:
task: text_zero_shot_classification
model: facebook/opt-350m
metrics: []
dataset_name: jeffdshen/inverse_superglue_mixedp1
dataset_config: jeffdshen--inverse_superglue_mixedp1
dataset_split: train
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-350m
* Dataset: jeffdshen/inverse_superglue_mixedp1
* Config: jeffdshen--inverse_superglue_mixedp1
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
dhuynh95/Magicoder-Evol-Instruct-2500-Deepseek-tokenized-0.5 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6041602
num_examples: 2500
download_size: 2764293
dataset_size: 6041602
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-human_aging-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 45666
num_examples: 223
download_size: 30517
dataset_size: 45666
---
# Dataset Card for "mmlu-human_aging-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_141 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1063909788
num_examples: 207309
download_size: 1086952329
dataset_size: 1063909788
---
# Dataset Card for "chunk_141"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joshikailashraj/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kyujinpy/KOR-Orca-Platypus-kiwi | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 72696825
num_examples: 45155
download_size: 38159019
dataset_size: 72696825
---
# ko-kiwi dataset🥝
## Merge datasets below
- Thanks to [HumanF-MarkrAI/WIKI_QA_Near_dedup](https://huggingface.co/datasets/HumanF-MarkrAI/WIKI_QA_Near_dedup) (about 10K examples sampled).
- Uses my dataset [kyujinpy/KOR-OpenOrca-Platypus](https://huggingface.co/datasets/kyujinpy/KOR-OpenOrca-Platypus).
|
jamestalentium/dialogsum_10_rm | ---
dataset_info:
features:
- name: id
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 9181.081861958266
num_examples: 10
download_size: 14579
dataset_size: 9181.081861958266
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dialogsum_10_rm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713084484 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26254
num_examples: 61
download_size: 13095
dataset_size: 26254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
norabelrose/truthful_qa | ---
license: apache-2.0
---
|
ryanwible/openassistant-guanaco-prompt-reformatted | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39384
num_examples: 9846
- name: test
num_bytes: 2072
num_examples: 518
download_size: 3157
dataset_size: 41456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jmaczan/rick-and-morty-scripts-llama-2 | ---
license: other
---
|
AmelieSchreiber/data_of_protein-rna_binding_sites | ---
license: mit
---
This is dataset "S1" from [Data of protein-RNA binding sites](https://www.sciencedirect.com/science/article/pii/S2352340916308022#s0035). |
ShadowSnow/java-test | ---
license: apache-2.0
---
|
KAUE24122023/EduardoDrummondGumball | ---
license: openrail
---
|
Aaryan333/MisaHub_WCE_Segmentation_train_val | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 131460889.53022918
num_examples: 2094
- name: validation
num_bytes: 32711768.699770816
num_examples: 524
download_size: 162770574
dataset_size: 164172658.23
---
# Dataset Card for "MisaHub_WCE_Segmentation_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712968751 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6278
num_examples: 15
download_size: 7823
dataset_size: 6278
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712968751"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
koutch/intro_prog | ---
dataset_info:
- config_name: dublin_metadata
features:
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: reference_solution
dtype: string
- name: description
dtype: string
- name: test
dtype: string
splits:
- name: train
num_bytes: 18983
num_examples: 36
- name: test
num_bytes: 17403
num_examples: 35
download_size: 41873
dataset_size: 36386
- config_name: singapore_metadata
features:
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: reference_solution
dtype: string
- name: description
dtype: string
- name: test
dtype: string
splits:
- name: train
num_bytes: 5577
num_examples: 5
download_size: 6139
dataset_size: 5577
- config_name: dublin_data
features:
- name: submission_id
dtype: int32
- name: func_code
dtype: string
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: correct
dtype: bool
- name: user
dtype: string
- name: academic_year
dtype: int32
splits:
- name: train
num_bytes: 4412068
num_examples: 7486
- name: test
num_bytes: 7737585
num_examples: 14259
download_size: 15756562
dataset_size: 12149653
- config_name: singapore_data
features:
- name: submission_id
dtype: int32
- name: func_code
dtype: string
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: correct
dtype: bool
splits:
- name: train
num_bytes: 5098928
num_examples: 4394
download_size: 5705043
dataset_size: 5098928
- config_name: dublin_repair
features:
- name: submission_id
dtype: int32
- name: func_code
dtype: string
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: annotation
dtype: string
- name: user
dtype: string
- name: academic_year
dtype: int32
splits:
- name: train
num_bytes: 229683
num_examples: 307
- name: test
num_bytes: 1451820
num_examples: 1698
download_size: 1929518
dataset_size: 1681503
- config_name: singapore_repair
features:
- name: submission_id
dtype: int32
- name: func_code
dtype: string
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: annotation
dtype: string
splits:
- name: train
num_bytes: 18979
num_examples: 18
download_size: 21737
dataset_size: 18979
- config_name: newcaledonia_metadata
features:
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: reference_solution
dtype: string
- name: description
dtype: string
- name: test
dtype: string
splits:
- name: train
num_bytes: 9053
num_examples: 9
download_size: 9760
dataset_size: 9053
- config_name: newcaledonia_data
features:
- name: submission_id
dtype: int32
- name: func_code
dtype: string
- name: assignment_id
dtype: string
- name: func_name
dtype: string
- name: description
dtype: string
- name: test
dtype: string
- name: correct
dtype: bool
splits:
- name: train
num_bytes: 932024
num_examples: 1201
download_size: 1198518
dataset_size: 932024
---
# Dataset Card for intro_prog
## Dataset Description
### Dataset Summary
IntroProg is a collection of students' submissions to assignments in various introductory programming courses offered at different universities.
Currently, the dataset contains submissions collected from Dublin City University and the National University of Singapore.
#### Dublin
The Dublin programming dataset is composed of students' submissions to introductory programming assignments at Dublin City University.
Students submitted these programs for multiple programming courses over the duration of three academic years.
#### Singapore
The Singapore dataset contains 2442 correct and 1783 buggy program attempts by 361 undergraduate students
taking an introductory Python programming course at NUS (National University of Singapore).
### Supported Tasks and Leaderboards
#### "Metadata": Program synthesis
Similar to the [Most Basic Python Programs](https://huggingface.co/datasets/mbpp) (mbpp) dataset, this split can be used to evaluate
code generation models.
#### "Data"
The data configuration contains all the submissions, along with an indicator of whether they passed the required tests.
#### "repair": Program refinement/repair
The "repair" configuration of each dataset is a subset of the "data" configuration
augmented with educators' annotations on the corrections to the buggy programs.
This configuration can be used for the task of program refinement. In [Computing Education Research](https://faculty.washington.edu/ajko/cer/) (CER),
methods for automatically repairing student programs are used to provide students with feedback and help them debug their code.
#### "bug": Bug classification
[Coming soon]
### Languages
The assignments were written in Python.
## Dataset Structure
One configuration is defined by one source dataset (*dublin*, *singapore*, or *newcaledonia*) and one subconfiguration ("metadata", "data", or "repair"; the New Caledonia data has no "repair" subconfiguration):
* "dublin_metadata"
* "dublin_data"
* "dublin_repair"
* "singapore_metadata"
* "singapore_data"
* "singapore_repair"
* "newcaledonia_metadata"
* "newcaledonia_data"
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
Some of the fields are configuration specific
* submission_id: a unique number identifying the submission
* user: a unique string identifying the (anonymized) student who submitted the solution
* date: the timestamp at which the grading server received the submission
* func_code: the cleaned code submitted
* func_name: the name of the function that had to be implemented
* assignment_id: the unique (string) identifier of the assignment that had to be completed
* academic_year: the starting year of the academic year (e.g. 2015 for the academic year 2015-2016)
* module: the course/module
* test: a HumanEval-style string which can be used to execute the submitted solution on the provided test cases
* description: a description of what the function is supposed to achieve
* correct: whether the solution passed all tests or not
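Because each row pairs `func_code` with an executable `test` string, grading a submission can be sketched in a few lines. This is a minimal illustration only; the helper name and the toy submission below are made up, not drawn from the dataset:

```python
def passes_tests(func_code: str, test_code: str) -> bool:
    """Execute a submitted function definition, then its test string, in a
    shared namespace; the submission passes when no assertion fails."""
    namespace: dict = {}
    try:
        exec(func_code, namespace)   # define the student's function
        exec(test_code, namespace)   # run the assertion-style tests
    except Exception:
        return False
    return True

# Toy submission and test string (hypothetical, not from the dataset):
good = "def add(a, b):\n    return a + b\n"
bad = "def add(a, b):\n    return a - b\n"
tests = "assert add(1, 2) == 3\nassert add(-1, 1) == 0\n"
print(passes_tests(good, tests), passes_tests(bad, tests))
# True False
```

In practice, student code is untrusted and should be run in a sandboxed subprocess with a timeout rather than via a bare `exec`.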
### Data Splits
#### Dublin
The Dublin dataset is split into a training and validation set. The training set contains the submissions to the assignments
written during the academic years 2015-2016, and 2016-2017, while the test set contains programs written during the academic year 2017-2018.
#### Singapore
The Singapore dataset only contains a training split, which can be used as a test split for evaluating how your feedback
methods perform on an unseen dataset (if, for instance, you train your methods on the Dublin dataset).
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
#### Dublin
#### Singapore
The data was released under the [GNU Lesser General Public License v3.0](https://github.com/githubhuyang/refactory/blob/master/LICENSE)
### Citation Information
```
@inproceedings{azcona2019user2code2vec,
title={user2code2vec: Embeddings for Profiling Students Based on Distributional Representations of Source Code},
author={Azcona, David and Arora, Piyush and Hsiao, I-Han and Smeaton, Alan},
booktitle={Proceedings of the 9th International Learning Analytics & Knowledge Conference (LAK’19)},
year={2019},
organization={ACM}
}
@inproceedings{DBLP:conf/edm/CleuziouF21,
author = {Guillaume Cleuziou and
Fr{\'{e}}d{\'{e}}ric Flouvat},
editor = {Sharon I{-}Han Hsiao and
Shaghayegh (Sherry) Sahebi and
Fran{\c{c}}ois Bouchet and
Jill{-}J{\^{e}}nn Vie},
title = {Learning student program embeddings using abstract execution traces},
booktitle = {Proceedings of the 14th International Conference on Educational Data
Mining, {EDM} 2021, virtual, June 29 - July 2, 2021},
publisher = {International Educational Data Mining Society},
year = {2021},
timestamp = {Wed, 09 Mar 2022 16:47:22 +0100},
biburl = {https://dblp.org/rec/conf/edm/CleuziouF21.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T04:16:45.714438](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-19T04-16-45.714438.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.345008389261745,\n\
\ \"em_stderr\": 0.004868244118482663,\n \"f1\": 0.4264691694630892,\n\
\ \"f1_stderr\": 0.004672170372384348,\n \"acc\": 0.3832876668064886,\n\
\ \"acc_stderr\": 0.007708220968501149\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.345008389261745,\n \"em_stderr\": 0.004868244118482663,\n\
\ \"f1\": 0.4264691694630892,\n \"f1_stderr\": 0.004672170372384348\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \
\ \"acc_stderr\": 0.003282055917136951\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T04_16_45.714438
path:
- '**/details_harness|drop|3_2023-10-19T04-16-45.714438.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T04-16-45.714438.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T04_16_45.714438
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-16-45.714438.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T04-16-45.714438.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T04_16_45.714438
path:
- '**/details_harness|winogrande|5_2023-10-19T04-16-45.714438.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T04-16-45.714438.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- results_2023-08-28T22:44:43.350947.parquet
- split: 2023_10_19T04_16_45.714438
path:
- results_2023-10-19T04-16-45.714438.parquet
- split: latest
path:
- results_2023-10-19T04-16-45.714438.parquet
---
# Dataset Card for Evaluation run of TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch",
"harness_winogrande_5",
split="train")
```
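The run-timestamp splits listed in the configurations above follow a simple naming rule: `-` and `:` in the ISO timestamp are replaced with `_`. A small sketch of that mapping (the `load_dataset` call is commented out since it requires network access to the Hub):

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-19T04:16:45.714438' to the
    split name used in this dataset ('2023_10_19T04_16_45.714438')."""
    return timestamp.replace("-", "_").replace(":", "_")

split = run_split_name("2023-10-19T04:16:45.714438")

# To load that specific run instead of the latest one:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch",
#     "harness_winogrande_5",
#     split=split,
# )
```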
## Latest results
These are the [latest results from run 2023-10-19T04:16:45.714438](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-19T04-16-45.714438.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.345008389261745,
"em_stderr": 0.004868244118482663,
"f1": 0.4264691694630892,
"f1_stderr": 0.004672170372384348,
"acc": 0.3832876668064886,
"acc_stderr": 0.007708220968501149
},
"harness|drop|3": {
"em": 0.345008389261745,
"em_stderr": 0.004868244118482663,
"f1": 0.4264691694630892,
"f1_stderr": 0.004672170372384348
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.003282055917136951
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865348
}
}
```
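Once loaded (e.g. with `json.load` on the results file linked above), individual metrics can be pulled out of this nested dict by harness task key. A minimal sketch using an abbreviated copy of the structure shown above:

```python
# Abbreviated copy of the aggregated results dict shown above.
results = {
    "all": {"em": 0.345008389261745, "f1": 0.4264691694630892, "acc": 0.3832876668064886},
    "harness|winogrande|5": {"acc": 0.7521704814522494, "acc_stderr": 0.012134386019865348},
    "harness|gsm8k|5": {"acc": 0.014404852160727824, "acc_stderr": 0.003282055917136951},
}

def metric(task: str, name: str) -> float:
    """Look up a single metric for a given harness task key."""
    return results[task][name]

print(f"winogrande acc: {metric('harness|winogrande|5', 'acc'):.3f}")
```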
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
polinaeterna/test_push_dataset_infos_json | ---
dataset_info:
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 3200
num_examples: 200
download_size: 3299
dataset_size: 4800
- config_name: v2
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 3200
num_examples: 200
download_size: 0
dataset_size: 3200
configs_kwargs:
- config_name: default
data_dir: ./
- config_name: v2
data_dir: v2
---
# Dataset Card for "test_push_dataset_infos_json"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/thematic1bembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 923680885
num_examples: 1273612
download_size: 288796772
dataset_size: 923680885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amlan107/syn_0 | ---
dataset_info:
features:
- name: bn
dtype: string
- name: ck
dtype: string
splits:
- name: train
num_bytes: 1794536.5235337194
num_examples: 12016
download_size: 839316
dataset_size: 1794536.5235337194
---
# Dataset Card for "syn_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nbeerbower__flammen6-mistral-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen6-mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen6-mistral-7B](https://huggingface.co/nbeerbower/flammen6-mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen6-mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T21:05:30.920430](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen6-mistral-7B/blob/main/results_2024-03-14T21-05-30.920430.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6463686254143772,\n\
\ \"acc_stderr\": 0.032083108489198105,\n \"acc_norm\": 0.6463981879688413,\n\
\ \"acc_norm_stderr\": 0.03274198265924435,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6347674012321349,\n\
\ \"mc2_stderr\": 0.015145748610941845\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156218,\n\
\ \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6886078470424218,\n\
\ \"acc_stderr\": 0.004621163476949209,\n \"acc_norm\": 0.869946225851424,\n\
\ \"acc_norm_stderr\": 0.0033567515689037672\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.036390575699529276,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.036390575699529276\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042114,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.01274920600765747,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.01274920600765747\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6347674012321349,\n\
\ \"mc2_stderr\": 0.015145748610941845\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.01267929754951543\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen6-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|arc:challenge|25_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|gsm8k|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hellaswag|10_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T21-05-30.920430.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T21-05-30.920430.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- '**/details_harness|winogrande|5_2024-03-14T21-05-30.920430.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T21-05-30.920430.parquet'
- config_name: results
data_files:
- split: 2024_03_14T21_05_30.920430
path:
- results_2024-03-14T21-05-30.920430.parquet
- split: latest
path:
- results_2024-03-14T21-05-30.920430.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen6-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen6-mistral-7B](https://huggingface.co/nbeerbower/flammen6-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen6-mistral-7B",
"harness_winogrande_5",
split="train")
```
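The timestamped split names above are derived mechanically from the run timestamp: the `-` and `:` characters are replaced with `_` so the name is a valid split identifier. A minimal sketch, using this run's timestamp:

```python
# Derive the timestamped split name used in this dataset from the run timestamp.
# The timestamp below is the one from this run (2024-03-14T21:05:30.920430).
timestamp = "2024-03-14T21:05:30.920430"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_03_14T21_05_30.920430
```

This matches the split names listed in the YAML configuration of this card.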
## Latest results
These are the [latest results from run 2024-03-14T21:05:30.920430](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen6-mistral-7B/blob/main/results_2024-03-14T21-05-30.920430.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of each configuration):
```json
{
"all": {
"acc": 0.6463686254143772,
"acc_stderr": 0.032083108489198105,
"acc_norm": 0.6463981879688413,
"acc_norm_stderr": 0.03274198265924435,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6347674012321349,
"mc2_stderr": 0.015145748610941845
},
"harness|arc:challenge|25": {
"acc": 0.6561433447098977,
"acc_stderr": 0.013880644570156218,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6886078470424218,
"acc_stderr": 0.004621163476949209,
"acc_norm": 0.869946225851424,
"acc_norm_stderr": 0.0033567515689037672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.036390575699529276,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.036390575699529276
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.01274920600765747,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.01274920600765747
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6347674012321349,
"mc2_stderr": 0.015145748610941845
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242914
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.01267929754951543
}
}
```
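Once a results JSON like the one above is loaded, the per-task scores can be aggregated with the standard library alone. A minimal sketch (the dictionary below is a small excerpt of the results shown above, not the full set):

```python
from statistics import mean

# Excerpt of the per-task accuracies from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7236842105263158},
}

# Unweighted average accuracy over the MMLU (hendrycksTest) subtasks in the excerpt.
mmlu_scores = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(round(mean(mmlu_scores), 4))
```

Note this is an unweighted mean over subtasks; the leaderboard's own aggregation may weight or group tasks differently.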
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Tonyhacker/fatimanoya | ---
license: openrail
---
|
kenil-samyak22/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
James4Ever0/network_security_questions | ---
license: wtfpl
---
This dataset contains a single file full of network security questions in Chinese.
It could be used as a good initial source for scrapers, though you might not want it in your browsing history.
|
mrm8488/CHISTES_spanish_jokes | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: keywords
dtype: string
- name: funny
dtype: int64
- name: category
dtype: string
splits:
- name: train
num_bytes: 814817
num_examples: 2419
download_size: 504749
dataset_size: 814817
task_categories:
- text-classification
- text-generation
language:
- es
pretty_name: chistes
---
# Dataset Card for "CHISTES_spanish_jokes"
Dataset from [Workshop for NLP introduction with Spanish jokes](https://github.com/liopic/chistes-nlp)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_MagicTelescope_gosdt_l512_d3_sd3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2606790213
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_MagicTelescope_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gguichard/myridade_dbg_aligned_ontologie_filter_myriade | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: labels
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 47868666
num_examples: 98206
download_size: 11206988
dataset_size: 47868666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "myridade_dbg_aligned_ontologie_filter_myriade"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AbdomenAtlas/AbdomenAtlas1.0Mini | ---
license: unknown
task_categories:
- image-segmentation
tags:
- medical
pretty_name: AbdomenAtlas 1.0 Mini
size_categories:
- 1K<n<10K
---
# Dataset Summary
The largest, fully-annotated CT dataset to date, including 5,195 annotated CT volumes (with spleen, liver, kidneys, stomach,
gallbladder, pancreas, aorta, and IVC annotations).
---
# Join the AbdomenAtlas Benchmarking Project
The Benchmarking Project aims to compare diverse semantic segmentation and pre-training algorithms.
We, the CCVL research group at Johns Hopkins University, invite creators of these algorithms to contribute to the initiative.
With our support, contributors will train their methodologies on the largest annotated CT dataset to date.
Subsequently, we will evaluate the trained models using a large internal dataset at Johns Hopkins University.
Contributors to this large-scale project will be offered authorship in the resulting paper.
If you are the creator of a semantic segmentation or pre-training algorithm and wish to advance medical AI by participating
in the Benchmark Project, please reach out to pedro.salvadorbassi2@unibo.it.
---
# Downloading Instructions
#### 1- Install the Hugging Face library:
```bash
pip install -U "huggingface_hub[cli]"
```
#### 2- Download the dataset:
```bash
mkdir AbdomenAtlas
cd AbdomenAtlas
huggingface-cli download AbdomenAtlas/AbdomenAtlas1.0Mini --repo-type dataset --local-dir . --cache-dir ./cache
```
<details>
<summary style="margin-left: 25px;">[Optional] Resume downloading</summary>
<div style="margin-left: 25px;">
If a previous download was interrupted, resume it by adding `--resume-download` to the download command:
```bash
huggingface-cli download AbdomenAtlas/AbdomenAtlas1.0Mini --repo-type dataset --local-dir . --cache-dir ./cache --resume-download
```
</div>
</details>
## Paper
<b>AbdomenAtlas-8K: Annotating 8,000 CT Volumes for Multi-Organ Segmentation in Three Weeks</b> <br/>
[Chongyu Qu](https://github.com/Chongyu1117)<sup>1</sup>, [Tiezheng Zhang](https://github.com/ollie-ztz)<sup>1</sup>, [Hualin Qiao](https://www.linkedin.com/in/hualin-qiao-a29438210/)<sup>2</sup>, [Jie Liu](https://ljwztc.github.io/)<sup>3</sup>, [Yucheng Tang](https://scholar.google.com/citations?hl=en&user=0xheliUAAAAJ)<sup>4</sup>, [Alan L. Yuille](https://www.cs.jhu.edu/~ayuille/)<sup>1</sup>, and [Zongwei Zhou](https://www.zongweiz.com/)<sup>1,*</sup> <br/>
<sup>1 </sup>Johns Hopkins University, <br/>
<sup>2 </sup>Rutgers University, <br/>
<sup>3 </sup>City University of Hong Kong, <br/>
<sup>4 </sup>NVIDIA <br/>
NeurIPS 2023 <br/>
[paper](https://www.cs.jhu.edu/~alanlab/Pubs23/qu2023abdomenatlas.pdf) | [code](https://github.com/MrGiovanni/AbdomenAtlas) | [dataset](https://huggingface.co/datasets/AbdomenAtlas/AbdomenAtlas1.0Mini) | [annotation](https://www.dropbox.com/scl/fi/28l5vpxrn212r2ejk32xv/AbdomenAtlas.tar.gz?rlkey=vgqmao4tgv51hv5ew24xx4xpm&dl=0) | [poster](document/neurips_poster.pdf)
<b>AbdomenAtlas-8K: Human-in-the-Loop Annotating Eight Anatomical Structures for 8,448 Three-Dimensional Computed Tomography Volumes in Three Weeks</b> <br/>
[Chongyu Qu](https://github.com/Chongyu1117)<sup>1</sup>, [Tiezheng Zhang](https://github.com/ollie-ztz)<sup>1</sup>, [Hualin Qiao](https://www.linkedin.com/in/hualin-qiao-a29438210/)<sup>2</sup>, [Jie Liu](https://ljwztc.github.io/)<sup>3</sup>, [Yucheng Tang](https://scholar.google.com/citations?hl=en&user=0xheliUAAAAJ)<sup>4</sup>, [Alan L. Yuille](https://www.cs.jhu.edu/~ayuille/)<sup>1</sup>, and [Zongwei Zhou](https://www.zongweiz.com/)<sup>1,*</sup> <br/>
<sup>1 </sup>Johns Hopkins University, <br/>
<sup>2 </sup>Rutgers University, <br/>
<sup>3 </sup>City University of Hong Kong, <br/>
<sup>4 </sup>NVIDIA <br/>
RSNA 2023 (Oral Presentation) <br/>
[paper](document/rsna_abstract.pdf) | [code](https://github.com/MrGiovanni/AbdomenAtlas) | [slides](document/rsna_slides.pdf)
## Citation
```
@article{qu2023abdomenatlas,
title={Abdomenatlas-8k: Annotating 8,000 CT volumes for multi-organ segmentation in three weeks},
author={Qu, Chongyu and Zhang, Tiezheng and Qiao, Hualin and Tang, Yucheng and Yuille, Alan L and Zhou, Zongwei},
journal={Advances in Neural Information Processing Systems},
volume={36},
year={2023}
}
@inproceedings{li2024well,
title={How Well Do Supervised Models Transfer to 3D Image Segmentation?},
author={Li, Wenxuan and Yuille, Alan and Zhou, Zongwei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024}
}
```
## Acknowledgements
This work was supported by the Lustgarten Foundation for Pancreatic Cancer Research and partially by the Patrick J. McGovern Foundation Award. We appreciate the effort of the MONAI Team to provide open-source code for the community. |
PKU-Alignment/BeaverTails | ---
license: cc-by-nc-4.0
task_categories:
- text-classification
language:
- en
tags:
- safe
- safety
- ai-safety
- moderation
- rejection-sampling
- llm
- lm
- human-feedback
size_categories:
- 100K<n<1M
configs:
- config_name: default
data_files:
- split: 330k_train
path: round0/330k/train.jsonl.xz
- split: 330k_test
path: round0/330k/test.jsonl.xz
- split: 30k_train
path: round0/30k/train.jsonl.gz
- split: 30k_test
path: round0/30k/test.jsonl.gz
---
# Dataset Card for BeaverTails
BeaverTails is an AI safety-focused collection comprising a series of datasets.
This repository includes human-labeled data consisting of question-answer (QA) pairs, each identified with their corresponding harm categories.
It should be noted that a single QA pair can be associated with more than one category.

The 14 harm categories are defined as follows:
1. **Animal Abuse:** This involves any form of cruelty or harm inflicted on animals, including physical abuse, neglect, or any behavior that causes distress or harm to an animal. It also includes content that promotes such behavior.
1. **Child Abuse:** This encompasses any form of physical, emotional, or sexual abuse directed toward children. It can also include neglect, exploitation, or any behavior that harms a child or violates their rights. Content that promotes or glorifies such behavior also falls under this category.
1. **Controversial Topics, Politics:** This refers to discussions or statements related to controversial topics, political ideologies, or sensitive issues that could lead to disagreement or conflict. These topics can be divisive and could potentially cause harm if not handled respectfully and thoughtfully.
1. **Discrimination, Stereotype, Injustice:** Discrimination refers to treating people unfairly or unequally based on their personal characteristics or affiliations such as race, age, gender, religion, etc. Stereotyping involves holding simplified and generalized beliefs about particular groups, which often leads to prejudices and discrimination. Injustice pertains to unfair treatment or violation of the rights of individuals or groups.
1. **Drug Abuse, Weapons, Banned Substance:** This category includes the misuse or abuse of illegal drugs, prescription drugs, or any other banned substances. It also involves the unauthorized or illegal possession, use, or trade of weapons.
1. **Financial Crime, Property Crime, Theft:** This includes all forms of illegal activities that lead to financial loss such as fraud, embezzlement, money laundering, bribery, and other forms of corruption. Property crime and theft involve the act of taking someone's property without their consent.
1. **Hate Speech, Offensive Language:** This category encompasses any form of communication or content that expresses intense hostility or violence towards an individual or group based on aspects like race, religion, ethnicity, nationality, gender, sexual orientation, or any other defining characteristics. This also includes language that is considered vulgar, crude, or offensive.
1. **Misinformation Regarding ethics, laws, and safety:** This category involves the spreading of incorrect or misleading information related to ethical issues, laws, regulations, or any matter related to safety. Misinformation can cause harm by influencing people's beliefs or actions negatively.
1. **Non-Violent Unethical Behavior:** This refers to any behavior that is morally or ethically wrong, but does not involve violence. Examples can include dishonest behaviors like lying, cheating, deceiving, spreading rumors, betraying trust, or any other form of unethical conduct.
1. **Privacy Violation:** This refers to actions that invade an individual's privacy, such as unauthorized access, use, disclosure, or collection of personal data. It also includes stalking, doxing, or any other form of intrusion into someone's personal life without their consent.
1. **Self-Harm:** This includes any form of self-inflicted harm, whether physical or emotional, or any content that promotes, glorifies, or encourages self-harm. This can also include discussions of suicidal thoughts or plans.
1. **Sexually Explicit, Adult Content:** This category includes any content that depicts explicit sexual activities or adult themes that are considered inappropriate or offensive. This can include pornography, explicit sexual text or messages, and any other form of sexually explicit material.
1. **Terrorism, Organized Crime:** This pertains to any form of content or action related to terrorism or organized crime, including endorsing or promoting terrorist activities, participating in organized criminal activities, or spreading propaganda for such groups.
1. **Violence, Aiding and Abetting, Incitement:** This involves any form of physical harm, threat, or violent behavior towards individuals or groups. Aiding and abetting refers to the act of helping, supporting, or encouraging such violent behaviors or illegal activities. Incitement pertains to the act of provoking or stirring up harmful, violent, or illegal actions.
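Because a single QA pair can carry several of the categories above, a record's harm labels are naturally represented as a category-to-boolean mapping. The sketch below shows multi-label filtering over such records; the field names (`prompt`, `is_safe`, `category`) are assumptions for illustration, not a confirmed schema, so check the loaded dataset's features before relying on them.

```python
# Hypothetical multi-label records; the "category" keys mirror the harm
# categories listed above (field names are assumptions, not the real schema).
records = [
    {"prompt": "q1", "is_safe": False,
     "category": {"animal_abuse": True, "privacy_violation": True}},
    {"prompt": "q2", "is_safe": True,
     "category": {"animal_abuse": False, "privacy_violation": False}},
]

def flagged_for(records, category):
    """Return the records whose category mapping flags `category` as True."""
    return [r for r in records if r["category"].get(category)]

print(len(flagged_for(records, "privacy_violation")))  # -> 1
```

A record can match several categories at once, which is exactly the multi-label behavior described above.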
**Disclaimer**: The BeaverTails dataset and its family contain content that may be offensive or upsetting.
Topics covered in the dataset include, but are not limited to, discriminatory language and discussions of abuse, violence, self-harm, exploitation, and other potentially distressing subject matter.
Please engage with the dataset responsibly and in accordance with your own personal risk tolerance.
The dataset is intended for research purposes, specifically for research aimed at creating safer and less harmful AI systems.
The views and opinions expressed in the dataset do not represent the views of the PKU-Alignment Team or any of its members.
It is important to emphasize that the dataset should not be used for training dialogue agents, as doing so may likely result in harmful model behavior.
The primary objective of this dataset is to facilitate research that could minimize or prevent the harm caused by AI systems.
## Usage
The code snippet below demonstrates how to load the QA-Classification dataset:
```python
from datasets import load_dataset
# Load the whole dataset
dataset = load_dataset('PKU-Alignment/BeaverTails')
# Load only the round 0 dataset
round0_dataset = load_dataset('PKU-Alignment/BeaverTails', data_dir='round0')
# Load the training and test splits (per the config above, the splits are
# named 330k_train/330k_test and 30k_train/30k_test, not plain train/test)
train_dataset = load_dataset('PKU-Alignment/BeaverTails', split='330k_train')
test_dataset = load_dataset('PKU-Alignment/BeaverTails', split='330k_test')
```
## Papers
You can find more information in our Paper:
- **Dataset Paper:** <https://arxiv.org/abs/2307.04657>
## Contact
The original authors host this dataset on GitHub here: https://github.com/PKU-Alignment/beavertails
## License
BeaverTails dataset and its family are released under the CC BY-NC 4.0 License.
|
autoevaluate/autoeval-staging-eval-project-squad_v2-4938eeea-11665554 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: nbroad/xdistil-l12-h384-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/xdistil-l12-h384-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
usamaamjad23/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
murilor9/retardado | ---
license: openrail
---
|
somosnlp/constitucion-politica-del-peru-1993-qa-gemma-2b-it-format | ---
dataset_info:
features:
- name: pregunta
dtype: string
- name: respuesta
dtype: string
splits:
- name: train
num_bytes: 1807541
num_examples: 2075
download_size: 680292
dataset_size: 1807541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mithmith/wowfishing | ---
license: unknown
---
|
plaguss/snli-small | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for snli-small
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("plaguss/snli-small")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("plaguss/snli-small")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| premise | Premise | TextField | True | False |
| hypothesis | Hypothesis | TextField | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | The hypothesis entails the premise, neither entails nor contradict each other, or the hypothesis contradicts the premise? | LabelQuestion | True | N/A | ['0', '1', '2'] |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; they contain the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"hypothesis": "A person is training his horse for a competition.",
"premise": "A person on a horse jumps over a broken down airplane."
},
"metadata": {},
"responses": [
{
"status": "submitted",
"values": {
"label": {
"value": "1"
}
}
}
],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"hypothesis": "A person is training his horse for a competition.",
"label": [
{
"status": "submitted",
"user_id": null,
"value": "1"
}
],
"label-suggestion": null,
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"metadata": "{}",
"premise": "A person on a horse jumps over a broken down airplane."
}
```
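The submitted annotation can be pulled out of a record in this `datasets` layout with plain dict access. The record below is copied from the example above; treat this as a sketch of the response format rather than a guaranteed schema.

```python
# Record copied from the HuggingFace `datasets` example above.
record = {
    "premise": "A person on a horse jumps over a broken down airplane.",
    "hypothesis": "A person is training his horse for a competition.",
    "label": [{"status": "submitted", "user_id": None, "value": "1"}],
}

# Keep only submitted responses and take the first one's value.
submitted = [r["value"] for r in record["label"] if r["status"] == "submitted"]
label = submitted[0] if submitted else None
print(label)  # -> 1  (per the question above: neither entailment nor contradiction)
```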
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **premise** is of type `TextField`.
* **hypothesis** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `LabelQuestion` with the following allowed values ['0', '1', '2'].
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `label_selection` with the following allowed values ['0', '1', '2'].
Additionally, we also have one more field which is optional and is the following:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Premise: A string used to determine the truthfulness of the hypothesis, Hypothesis: A string that may be true, false, or whose truth conditions may not be knowable when compared to the premise
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-2bec9f-2053467110 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-13b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-13b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
medicreal/minecraft-stuff | ---
license: openrail
---
|
PercyTG/AIVC | ---
license: openrail
---
|