| datasetId | card |
|---|---|
pizzagalaxy/dougiesmodels | ---
license: unknown
---
|
Pedrampedram/MarketMail-AI-Dataset | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 7791
num_examples: 10
download_size: 11307
dataset_size: 7791
---
# Dataset Card for "MarketMail-AI-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pythainlp/thailand-policy-statements | ---
dataset_info:
features:
- name: n_cabinet
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 3019140
num_examples: 60
download_size: 1038348
dataset_size: 3019140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
task_categories:
- text-generation
language:
- th
size_categories:
- n<1K
---
# Thailand Policy Statements
A collection of all policy statements from the Thai government.
License: CC0 1.0
This project is part of the PyThaiNLP project.
Github: [https://github.com/PyThaiNLP/thailand-policy-statements](https://github.com/PyThaiNLP/thailand-policy-statements)
## Citation
> Phatthiyaphaibun, W. (2024). Thailand Policy Statements (1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.10842589
or
```
@dataset{phatthiyaphaibun_2024_10842589,
author = {Phatthiyaphaibun, Wannaphong},
title = {Thailand Policy Statements},
month = mar,
year = 2024,
publisher = {Zenodo},
version = {1.0},
doi = {10.5281/zenodo.10842589},
url = {https://doi.org/10.5281/zenodo.10842589}
}
``` |
lhallee/uniref50_50-512 | ---
dataset_info:
features:
- name: uniref
dtype: string
splits:
- name: train
num_bytes: 10696656442
num_examples: 51521691
download_size: 10582703793
dataset_size: 10696656442
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "uniref50_50-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Marlon154/moral-number-corpus | ---
license: cc-by-sa-4.0
language:
- en
size_categories:
- 1K<n<10K
---
# A Perspectivist Corpus for Moral and Social Judgements
We constructed a corpus of moral and social judgements (questions are derived from the [Commonsense Norm Bank](https://arxiv.org/abs/2110.07574)) that asks people to fill in number ranges that do not change a given judgement.
Our corpus was crowdsourced from 30 annotators and contains 898 statements for a total of 3k annotations.
This work adds to available moral and social judgement data by providing ranges of (un)acceptable behaviors and annotator demographics.
This work supports perspectivist and pluralistic approaches, with the goal of creating models that can understand and express multiple points of view, identify whose point of view is being expressed, and convey uncertainty about definitive answers.
The number to replace is randomly chosen from all numbers in a given question.
## Structure of ``annotated_questions.csv``
### Core Data
- id: A unique identifier for each data entry. It has the following structure:
- Prefix:
- "ff" indicates a freeform question and
- "yn" indicates a yes-no question from the Commonsense Norm Bank.
- Separator: "x"
- Subset:
- "tr" for the training set.
- "te" for the test set.
- "va" for the validation set.
- Separator: "x"
- Subset ID: A unique numerical ID assigned within the specified subset.
- number_to_replace: The original number within the statement that participants are asked to replace.
- numeric_num: The original numeric value present in the statement (represented as a list in case of multiple numbers).
- form: The format of the question ('freeform' or 'yes_no').
- set_type: Specifies if the data point is part of the training, validation, or testing set.
- statement: The moral statement presented to participants, with ``<<NUM>>`` marking the number to be replaced.
- class_label: Numerical rating indicating the moral judgment (-1 negative, 0 neutral, and 1 positive).
- text_label: Textual version of the moral judgment (e.g., "It's understandable").
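The id structure above can be decoded mechanically. A minimal sketch (the function name and return shape are illustrative, not part of the corpus):

```python
# Decode ids of the form <prefix>x<subset>x<subset_id>, as described above.
FORMS = {"ff": "freeform", "yn": "yes_no"}
SUBSETS = {"tr": "train", "te": "test", "va": "validation"}

def parse_id(entry_id: str):
    """Split an id such as 'ffxtrx42' into (form, set_type, subset_id)."""
    prefix, subset, subset_id = entry_id.split("x", 2)
    return FORMS[prefix], SUBSETS[subset], int(subset_id)
```

For example, `parse_id("ynxvax7")` yields `("yes_no", "validation", 7)`.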
### Replacement Information
- list_span_start: A list of possible starting indices in the statement where the number span to be replaced could begin.
- list_span_end: A list of possible ending indices in the statement where the number span to be replaced could end.
- to_inf: A boolean (True/False) indicating whether the word "inf" (infinity) is a valid replacement option.
- not_modifiable: A boolean (True/False) indicating whether the number is meant to remain unchanged.
### IAA
- agreement: An agreement score (Jaccard index, between 0.0 and 1.0) indicating the consistency between different annotators who judged the same statement.
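The card does not spell out which sets the Jaccard index is taken over; as a hedged sketch, treating each annotator's accepted number range as a set of integers gives:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard index |A & B| / |A | B|, between 0.0 and 1.0."""
    if not a and not b:
        return 1.0  # convention: two empty selections agree fully
    return len(a & b) / len(a | b)

# Illustrative only: two annotators' accepted integer ranges for <<NUM>>
ann_a = set(range(1, 11))  # accepts 1..10
ann_b = set(range(5, 16))  # accepts 5..15
agreement = jaccard(ann_a, ann_b)  # |{5..10}| / |{1..15}| = 6/15 = 0.4
```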
## Structure of ``annotations.json``
The annotations.json file contains a list of objects, each representing an annotator and their associated surveys. Here's a detailed breakdown of the structure:
- id (string): A unique identifier for the annotator.
- age (string): The age of the annotator.
- nation (string): The nation the annotator is from.
- religion (string): The religion of the annotator.
- education (string): The education level of the annotator.
- political (string): The political leaning of the annotator.
- gender (string): The gender of the annotator.
- surveys (array): A list of surveys completed by the annotator. Each survey is an object with the following fields:
- sid (string): A unique identifier for the survey.
- time (integer): The time taken to complete the survey.
- out_counter (float): A field related to the survey (exact meaning not provided).
- inf_counter (float): Another field related to the survey (exact meaning not provided).
- answers (object): An object where each key is a question identifier and the value is another object with the following fields:
- start (string): The start time for answering the question.
- end (string): The end time for answering the question.
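To make the nesting concrete, here is a hedged sketch that walks a record shaped like the description above (the sample values are invented, not taken from the corpus):

```python
import json

# An invented record matching the structure described above.
sample = json.loads("""
[{"id": "a1", "age": "30", "nation": "DE", "religion": "none",
  "education": "BSc", "political": "center", "gender": "f",
  "surveys": [{"sid": "s1", "time": 120, "out_counter": 0.0,
               "inf_counter": 1.0,
               "answers": {"ffxtrx1": {"start": "10:00", "end": "10:02"}}}]}]
""")

def questions_answered(annotators):
    """Map each annotator id to the question ids they answered."""
    return {
        a["id"]: [qid for s in a["surveys"] for qid in s["answers"]]
        for a in annotators
    }
```

`questions_answered(sample)` returns `{"a1": ["ffxtrx1"]}`.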
## Structure of ``all_questions.csv``
The ``all_questions.csv`` file contains a list of questions that were extracted from the [Commonsense Norm Bank](https://arxiv.org/abs/2110.07574). Each row in the CSV file represents a single question and has the following columns:
- id: A unique identifier for each data entry. It has the following structure:
- Prefix:
- "ff" indicates a freeform question and
- "yn" indicates a yes-no question from the Commonsense Norm Bank.
- Separator: "x"
- Subset:
- "tr" for the training set.
- "te" for the test set.
- "va" for the validation set.
- Separator: "x"
- Subset ID: A unique numerical ID assigned within the specified subset.
- number_to_replace: The original number within the statement that participants are asked to replace.
- numeric_num: The original numeric value present in the statement (represented as a list in case of multiple numbers).
- form: The format of the question ('freeform' or 'yes_no').
- set_type: Specifies if the data point is part of the training, validation, or testing set.
- statement: The moral statement presented to participants, with ``<<NUM>>`` marking the number to be replaced.
- class_label: Numerical rating indicating the moral judgment (-1 negative, 0 neutral, and 1 positive).
- text_label: Textual version of the moral judgment (e.g., "It's understandable").
|
DL3DV/DL3DV-Benchmark | ---
tags:
- 3D vision
- novel view synthesis
- NeRF
- 3D Gaussian Splatting
- Generalizable NeRF
- Generative Methods
- text-to-3d
- image-to-3d
pretty_name: DL3DV
size_categories:
- n>1T
---
# DL3DV Benchmark Download Instructions
This repo contains all the benchmark data, including a README, a license, colmaps/images (compatible with nerfstudio and 3D Gaussian Splatting), scene labels, and the performance of the methods reported in the paper (ZipNeRF, 3D GS, MipNeRF-360, nerfacto, Instant-NGP).
# Download
As the whole benchmark dataset is very large (~2.1 TB), we provide two ways to download it: download the full benchmark dataset, or use a script to download a subset for storage-sensitive cases.
## Full benchmark dataset download
If you have enough space (more than 2.1 TB), downloading the full benchmark is simple:
``` bash
# Make sure you have git-lfs installed
# (https://git-lfs.github.com/)
git lfs install
git clone https://huggingface.co/datasets/DL3DV/DL3DV-10K-Benchmark
```
## Script download
Sometimes you may need to download only a subset of the benchmark flexibly, e.g. just several scenes, or just the images at 960P resolution (the images_4 level used in the paper). To provide this flexibility, we offer a [download.py](https://huggingface.co/datasets/DL3DV/DL3DV-10K-Benchmark/blob/main/download.py) script.
Use this [link](https://huggingface.co/datasets/DL3DV/DL3DV-10K-Benchmark/resolve/main/download.py?download=true) to download.
This download script provides several options:
* Download the full dataset (equivalent to the git clone method). ~2.1 TB in total.
* Download the full dataset with only 960P images. 100–150 GB in total.
* Download specific scenes by name (hash)
### Environment Setup
The download script relies on `huggingface_hub`, `tqdm`, and `pandas`. You can install them with the following command in your Python environment.
```bash
pip install huggingface_hub tqdm pandas
```
After installing `huggingface_hub`, remember to log in first so you are ready to download.
```bash
# in terminal, use the following command and your huggingface token to login
huggingface-cli login
```
### Download the full benchmark
To download the full dataset, use this command:
``` bash
# Note, it is suggested to use --clean_cache flag as it saves space by cleaning the cache folder created by huggingface hub API.
python download.py --subset full --clean_cache
```
### Download the full benchmark at 960P resolution (same as the paper)
Not all methods can handle multiple resolutions, and some make assumptions about the input resolution, so the paper uses 960P.
``` bash
# Note, it is suggested to use --clean_cache flag as it saves space by cleaning the cache folder created by huggingface hub API.
python download.py --subset full --only_level4 --clean_cache
```
### Download with specific scene name (hash name)
There is a benchmark preview page at https://github.com/DL3DV-10K/Dataset. If you just need a specific scene hash (e.g. 0853979305f7ecb80bd8fc2c8df916410d471ef04ed5f1a64e9651baa41d7695), use the following command:
``` bash
# Note, it is suggested to use --clean_cache flag as it saves space by cleaning the cache folder created by huggingface hub API.
# e.g. a scene with hash 0853979305f7ecb80bd8fc2c8df916410d471ef04ed5f1a64e9651baa41d7695
python download.py --subset hash --hash 0853979305f7ecb80bd8fc2c8df916410d471ef04ed5f1a64e9651baa41d7695 --only_level4
``` |
EleutherAI/quirky_multiplication_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: template_args
struct:
- name: character
dtype: string
- name: op1
dtype: int64
- name: op2
dtype: int64
- name: result
dtype: int64
- name: character
dtype: string
- name: label
dtype: bool
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: difficulty_quantile
dtype: float64
splits:
- name: train
num_bytes: 26256000
num_examples: 384000
- name: validation
num_bytes: 547000
num_examples: 8000
- name: test
num_bytes: 547000
num_examples: 8000
download_size: 13389837
dataset_size: 27350000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_elinas__chronos-mistral-7b | ---
pretty_name: Evaluation run of elinas/chronos-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos-mistral-7b](https://huggingface.co/elinas/chronos-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:39:41.464301](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-mistral-7b/blob/main/results_2024-04-09T06-39-41.464301.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4943261332869945,\n\
\ \"acc_stderr\": 0.03440242385841512,\n \"acc_norm\": 0.4990977698278415,\n\
\ \"acc_norm_stderr\": 0.035152578733964476,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.48059222372011373,\n\
\ \"mc2_stderr\": 0.014984088747615087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985989,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5751842262497511,\n\
\ \"acc_stderr\": 0.004933047726996793,\n \"acc_norm\": 0.7719577773351922,\n\
\ \"acc_norm_stderr\": 0.004187124964848515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.030772653642075657,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.030772653642075657\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992062,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992062\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\
\ \"acc_stderr\": 0.02794172734625631,\n \"acc_norm\": 0.5935483870967742,\n\
\ \"acc_norm_stderr\": 0.02794172734625631\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841586,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841586\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846486,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911498,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911498\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526731,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526731\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829125,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829125\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.03190080389473235,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.03190080389473235\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678492,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678492\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902887,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902887\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.48059222372011373,\n\
\ \"mc2_stderr\": 0.014984088747615087\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620296\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20849128127369218,\n \
\ \"acc_stderr\": 0.011189587985791425\n }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|winogrande|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-39-41.464301.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- results_2024-04-09T06-39-41.464301.parquet
- split: latest
path:
- results_2024-04-09T06-39-41.464301.parquet
---
# Dataset Card for Evaluation run of elinas/chronos-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [elinas/chronos-mistral-7b](https://huggingface.co/elinas/chronos-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos-mistral-7b",
"harness_winogrande_5",
	split="latest")
```
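Since each dated split is named after its run timestamp (with underscores replacing the `-` and `:` punctuation, as in the config list above), you can recover the actual datetime with a small helper. This is a sketch of that naming convention, not part of the `datasets` API:

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    # Split names such as "2024_04_09T06_39_41.464301" encode the run
    # timestamp with "_" in place of "-" (date) and ":" (time).
    date_part, time_part = split_name.split("T")
    return datetime.fromisoformat(
        f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    )

ts = split_to_timestamp("2024_04_09T06_39_41.464301")
print(ts.isoformat())  # 2024-04-09T06:39:41.464301
```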
## Latest results
These are the [latest results from run 2024-04-09T06:39:41.464301](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-mistral-7b/blob/main/results_2024-04-09T06-39-41.464301.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4943261332869945,
"acc_stderr": 0.03440242385841512,
"acc_norm": 0.4990977698278415,
"acc_norm_stderr": 0.035152578733964476,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.48059222372011373,
"mc2_stderr": 0.014984088747615087
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985989,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128342
},
"harness|hellaswag|10": {
"acc": 0.5751842262497511,
"acc_stderr": 0.004933047726996793,
"acc_norm": 0.7719577773351922,
"acc_norm_stderr": 0.004187124964848515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.030772653642075657,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.030772653642075657
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992062,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.02794172734625631,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.02794172734625631
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911498,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911498
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526731,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526731
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829125,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829125
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.03190080389473235,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.03190080389473235
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678492,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678492
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902887,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902887
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.48059222372011373,
"mc2_stderr": 0.014984088747615087
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620296
},
"harness|gsm8k|5": {
"acc": 0.20849128127369218,
"acc_stderr": 0.011189587985791425
}
}
```
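The per-task `acc_stderr` values in the JSON above are consistent with the sample standard error of a Bernoulli proportion, sqrt(p·(1−p)/(n−1)). A quick sanity check (illustrative only; the subset sizes used here are the standard counts of 100 questions for MMLU high_school_computer_science and 1319 for GSM8K):

```python
import math

# Sanity check: each acc_stderr above matches the sample standard error of a
# Bernoulli proportion, sqrt(p * (1 - p) / (n - 1)), for that task's size n.
def acc_stderr(p, n):
    return math.sqrt(p * (1 - p) / (n - 1))

# high_school_computer_science: 100 questions, acc = 0.58
print(acc_stderr(0.58, 100))                  # ≈ 0.0496045, as reported above
# gsm8k: 1319 questions, acc ≈ 0.2084913
print(acc_stderr(0.20849128127369218, 1319))  # ≈ 0.0111896, as reported above
```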
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FINNUMBER/FINCH_TRAIN_NQA_300_per100_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 967292
num_examples: 300
download_size: 561994
dataset_size: 967292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-electrical_engineering-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5473
num_examples: 5
- name: test
num_bytes: 275445
num_examples: 145
download_size: 13670
dataset_size: 280918
---
# Dataset Card for "mmlu-electrical_engineering-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_pmlb_100000_spambase_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2364400000
num_examples: 100000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 340594567
dataset_size: 2600840000
---
# Dataset Card for "autotree_pmlb_100000_spambase_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ssoh/mcq_dataset | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: Correct Answer
dtype: string
- name: Explanation
dtype: string
- name: formatted_chat
dtype: string
splits:
- name: train
num_bytes: 399116
num_examples: 334
- name: test
num_bytes: 49874
num_examples: 41
- name: val
num_bytes: 49114
num_examples: 43
download_size: 188636
dataset_size: 498104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
plaguss/ag_news_tutorial | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for ag_news_tutorial
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("plaguss/ag_news_tutorial")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("plaguss/ag_news_tutorial")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text from the article | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | In which category does this article fit? | label_selection | True | N/A | ['0', '1', '2', '3'] |
The **suggestions** are human- or machine-generated recommendations for each question that assist the annotator during the annotation process. They are always linked to existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the suggestion value(s) and its metadata, respectively. Accordingly, the possible values are the same as in the table above, but the column name has "-suggestion" appended for the value and "-suggestion-metadata" appended for the metadata.
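As a small illustration of this naming convention (the helper below is hypothetical, not part of the Argilla API):

```python
# Sketch of the "-suggestion" column-naming convention described above:
# for each question, the export carries the suggestion value and its metadata
# as two extra columns derived from the question name.
question_names = ["label"]  # mirrors the questions table for this dataset

def suggestion_columns(question_name):
    # Hypothetical helper, shown only to make the convention concrete.
    return [f"{question_name}-suggestion", f"{question_name}-suggestion-metadata"]

for name in question_names:
    print(suggestion_columns(name))
# → ['label-suggestion', 'label-suggestion-metadata']
```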
**✨ NEW** The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give annotators additional context, or to record details about the record itself, for example a link to its original source, its author, or its date. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "record-0",
"fields": {
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
},
"metadata": {},
"responses": [],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "record-0",
"label": [],
"label-suggestion": null,
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"metadata": "{}",
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `label_selection` with the following allowed values ['0', '1', '2', '3'].
* **Suggestions:** As of Argilla 1.13.0, suggestions can be included to ease or assist annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `label_selection` with the following allowed values ['0', '1', '2', '3'].
Additionally, we also have two more fields that are optional and are the following:
* **✨ NEW** **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators additional context, or to record details about the record itself, for example a link to its original source, its author, or its date. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
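Note that in the `datasets` export shown earlier, `metadata` arrives as a JSON-encoded string (e.g. `"{}"`), so it needs to be decoded before use. A minimal sketch using a stand-in record:

```python
import json

# In the `datasets` export, `metadata` is serialized as a JSON string ("{}"),
# so decode it once to get a plain dict back.
record = {"external_id": "record-0", "metadata": "{}"}  # minimal stand-in record
metadata = json.loads(record["metadata"])
print(metadata)               # → {}
print(record["external_id"])  # → record-0
```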
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This dataset contains a collection of news articles. Please label each one with the category it belongs to.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AI4EPS/quakeflow_sc | ---
license: mit
---
|
Siddish/change-my-view-subreddit-cleaned | ---
task_categories:
- text-generation
language:
- en
pretty_name: Opinionated LLM with r/CMV
size_categories:
- 1K<n<10K
---
# Opinionated LLM |
liuyanchen1015/MULTI_VALUE_qqp_corr_conjunction_doubling | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 373836
num_examples: 1607
- name: test
num_bytes: 3287571
num_examples: 14409
- name: train
num_bytes: 3437753
num_examples: 14522
download_size: 4206688
dataset_size: 7099160
---
# Dataset Card for "MULTI_VALUE_qqp_corr_conjunction_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
evkes/llama-formatted-del | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1096696
num_examples: 649
download_size: 380097
dataset_size: 1096696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SuryaGrandhi/DLClassProjectData | ---
license: unknown
---
|
CyberHarem/yatadera_narumi_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yatadera_narumi/矢田寺成美 (Touhou)
This is the dataset of yatadera_narumi/矢田寺成美 (Touhou), containing 11 images and their tags.
The core tags of this character are `black_hair, braid, hat, long_hair, twin_braids, bangs, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 27 | 18.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 27 | 23.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yatadera_narumi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yatadera_narumi_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, ajirogasa, grey_dress, long_sleeves, solo, red_capelet, buttons, looking_at_viewer, clothes_writing, smile, long_earlobes, own_hands_together, snowing, blush, open_mouth, closed_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ajirogasa | grey_dress | long_sleeves | solo | red_capelet | buttons | looking_at_viewer | clothes_writing | smile | long_earlobes | own_hands_together | snowing | blush | open_mouth | closed_mouth | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------------|:---------------|:-------|:--------------|:----------|:--------------------|:------------------|:--------|:----------------|:---------------------|:----------|:--------|:-------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/squad_qa_wrong_title_v5_full_no_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7855838.683639287
num_examples: 4778
- name: validation
num_bytes: 361864
num_examples: 300
download_size: 1219794
dataset_size: 8217702.683639287
---
# Dataset Card for "squad_qa_wrong_title_v5_full_no_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shahidul034/text_summarization_dataset8 | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 126184009
num_examples: 101745
download_size: 44181954
dataset_size: 126184009
---
# Dataset Card for "text_summarization_dataset8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KE-AI/text-gen | ---
task_categories:
- text-generation
- text2text-generation
- conversational
---
# Kroh:
Tonas `dataset_kel.txt`.
<br>
Tas tehst kroh:
<br>
`Tehst`→`, ant tehst nymer la.\nTehst ala ton.` |
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-7b-chat-ckpt-hf | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-7b-chat-ckpt-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T21:18:42.609211](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-7b-chat-ckpt-hf/blob/main/results_2023-12-29T21-18-42.609211.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3651457377147079,\n\
\ \"acc_stderr\": 0.0337649318691844,\n \"acc_norm\": 0.36947752907373566,\n\
\ \"acc_norm_stderr\": 0.03463087989078143,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4999073626978088,\n\
\ \"mc2_stderr\": 0.015580803887648534\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41552901023890787,\n \"acc_stderr\": 0.014401366641216383,\n\
\ \"acc_norm\": 0.4496587030716723,\n \"acc_norm_stderr\": 0.01453714444428473\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5032861979685321,\n\
\ \"acc_stderr\": 0.004989673640014256,\n \"acc_norm\": 0.7018522206731727,\n\
\ \"acc_norm_stderr\": 0.004565098421085231\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.35471698113207545,\n \"acc_stderr\": 0.029445175328199586,\n\
\ \"acc_norm\": 0.35471698113207545,\n \"acc_norm_stderr\": 0.029445175328199586\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.03979236637497411,\n\
\ \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.03979236637497411\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4032258064516129,\n\
\ \"acc_stderr\": 0.02790615082604114,\n \"acc_norm\": 0.4032258064516129,\n\
\ \"acc_norm_stderr\": 0.02790615082604114\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.47878787878787876,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.47878787878787876,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43005181347150256,\n \"acc_stderr\": 0.035729543331448066,\n\
\ \"acc_norm\": 0.43005181347150256,\n \"acc_norm_stderr\": 0.035729543331448066\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.023060438380857744,\n\
\ \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.023060438380857744\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3926605504587156,\n \"acc_stderr\": 0.020937505161201096,\n \"\
acc_norm\": 0.3926605504587156,\n \"acc_norm_stderr\": 0.020937505161201096\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380757,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380757\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4978902953586498,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.4978902953586498,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.3991031390134529,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.047323326159788154,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.047323326159788154\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.37423312883435583,\n \"acc_stderr\": 0.038020681028996146,\n\
\ \"acc_norm\": 0.37423312883435583,\n \"acc_norm_stderr\": 0.038020681028996146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.47435897435897434,\n\
\ \"acc_stderr\": 0.03271298896811159,\n \"acc_norm\": 0.47435897435897434,\n\
\ \"acc_norm_stderr\": 0.03271298896811159\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.44189016602809705,\n\
\ \"acc_stderr\": 0.017758800534214424,\n \"acc_norm\": 0.44189016602809705,\n\
\ \"acc_norm_stderr\": 0.017758800534214424\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3786127167630058,\n \"acc_stderr\": 0.026113749361310334,\n\
\ \"acc_norm\": 0.3786127167630058,\n \"acc_norm_stderr\": 0.026113749361310334\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225612,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.34967320261437906,\n \"acc_stderr\": 0.0273053080762747,\n\
\ \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.0273053080762747\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.026571483480719978,\n\
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.026571483480719978\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3155149934810952,\n\
\ \"acc_stderr\": 0.011869184843058643,\n \"acc_norm\": 0.3155149934810952,\n\
\ \"acc_norm_stderr\": 0.011869184843058643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4019607843137255,\n \"acc_stderr\": 0.019835176484375373,\n \
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.019835176484375373\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425463,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425463\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.02939360931987981,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.02939360931987981\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39303482587064675,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.39303482587064675,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.0383161053282193,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.0383161053282193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4999073626978088,\n\
\ \"mc2_stderr\": 0.015580803887648534\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6937647987371744,\n \"acc_stderr\": 0.012954385972802462\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.0031957470754808283\n }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|arc:challenge|25_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|gsm8k|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hellaswag|10_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-26-49.538112.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-18-42.609211.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-18-42.609211.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- '**/details_harness|winogrande|5_2023-12-27T14-26-49.538112.parquet'
- split: 2023_12_29T21_18_42.609211
path:
- '**/details_harness|winogrande|5_2023-12-29T21-18-42.609211.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T21-18-42.609211.parquet'
- config_name: results
data_files:
- split: 2023_12_27T14_26_49.538112
path:
- results_2023-12-27T14-26-49.538112.parquet
- split: 2023_12_29T21_18_42.609211
path:
- results_2023-12-29T21-18-42.609211.parquet
- split: latest
path:
- results_2023-12-29T21-18-42.609211.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-7b-chat-ckpt-hf",
"harness_winogrande_5",
	split="latest")
```
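The timestamped split names listed in the configurations above are derived directly from the run timestamps: the `-` and `:` separators in the date/time portion are replaced with `_`, while the fractional seconds keep their `.`. A minimal sketch of that mapping (the helper name `timestamp_to_split` is illustrative, not part of any library API):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name in this dataset.

    e.g. "2023-12-29T21:18:42.609211" -> "2023_12_29T21_18_42.609211"
    """
    # Only "-" and ":" are normalized; the "." before microseconds stays.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-29T21:18:42.609211"))
# 2023_12_29T21_18_42.609211
```

This can be handy for loading a specific historical run by passing the converted name as the `split` argument instead of `"latest"`.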
## Latest results
These are the [latest results from run 2023-12-29T21:18:42.609211](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-7b-chat-ckpt-hf/blob/main/results_2023-12-29T21-18-42.609211.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3651457377147079,
"acc_stderr": 0.0337649318691844,
"acc_norm": 0.36947752907373566,
"acc_norm_stderr": 0.03463087989078143,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4999073626978088,
"mc2_stderr": 0.015580803887648534
},
"harness|arc:challenge|25": {
"acc": 0.41552901023890787,
"acc_stderr": 0.014401366641216383,
"acc_norm": 0.4496587030716723,
"acc_norm_stderr": 0.01453714444428473
},
"harness|hellaswag|10": {
"acc": 0.5032861979685321,
"acc_stderr": 0.004989673640014256,
"acc_norm": 0.7018522206731727,
"acc_norm_stderr": 0.004565098421085231
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.35471698113207545,
"acc_stderr": 0.029445175328199586,
"acc_norm": 0.35471698113207545,
"acc_norm_stderr": 0.029445175328199586
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.03979236637497411,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.03979236637497411
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4032258064516129,
"acc_stderr": 0.02790615082604114,
"acc_norm": 0.4032258064516129,
"acc_norm_stderr": 0.02790615082604114
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782426
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.47878787878787876,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.47878787878787876,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43005181347150256,
"acc_stderr": 0.035729543331448066,
"acc_norm": 0.43005181347150256,
"acc_norm_stderr": 0.035729543331448066
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2923076923076923,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.2923076923076923,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3926605504587156,
"acc_stderr": 0.020937505161201096,
"acc_norm": 0.3926605504587156,
"acc_norm_stderr": 0.020937505161201096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380757,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380757
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4978902953586498,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.4978902953586498,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.047323326159788154,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.047323326159788154
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.37423312883435583,
"acc_stderr": 0.038020681028996146,
"acc_norm": 0.37423312883435583,
"acc_norm_stderr": 0.038020681028996146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.03271298896811159,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.03271298896811159
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.44189016602809705,
"acc_stderr": 0.017758800534214424,
"acc_norm": 0.44189016602809705,
"acc_norm_stderr": 0.017758800534214424
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3786127167630058,
"acc_stderr": 0.026113749361310334,
"acc_norm": 0.3786127167630058,
"acc_norm_stderr": 0.026113749361310334
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225612,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.0273053080762747,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.0273053080762747
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.026571483480719978,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.026571483480719978
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3155149934810952,
"acc_stderr": 0.011869184843058643,
"acc_norm": 0.3155149934810952,
"acc_norm_stderr": 0.011869184843058643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.019835176484375373,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.019835176484375373
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.02939360931987981,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.02939360931987981
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39303482587064675,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.39303482587064675,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.0383161053282193,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.0383161053282193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4999073626978088,
"mc2_stderr": 0.015580803887648534
},
"harness|winogrande|5": {
"acc": 0.6937647987371744,
"acc_stderr": 0.012954385972802462
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808283
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
henrydz/ocr | ---
license: apache-2.0
---
|
d0rj/dialogsum-ru | ---
annotations_creators:
- expert-generated
language_creators:
- translated
language:
- ru
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- knkarthick/dialogsum
task_categories:
- summarization
- text2text-generation
- text-generation
task_ids: []
pretty_name: DIALOGSum Corpus (ru)
tags:
- conversations-summarization
- dialogue-summarization
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 19115158
num_examples: 12460
- name: validation
num_bytes: 746312
num_examples: 500
- name: test
num_bytes: 2282379
num_examples: 1500
download_size: 10144708
dataset_size: 22143849
train-eval-index:
- config: samsum
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
dialogue: text
summary: target
---
# Dataset Card for DIALOGSum Corpus
## Dataset Description
### Links
- **Homepage:** https://aclanthology.org/2021.findings-acl.449
- **Repository:** https://github.com/cylnlp/dialogsum
- **Paper:** https://aclanthology.org/2021.findings-acl.449
### Dataset Summary
DialogSum is a large-scale dialogue summarization dataset consisting of 13,460 dialogues (plus 100 held-out dialogues for topic generation), each with a manually labeled summary and topic.
### Languages
Russian (translated from English by Google Translate).
## Dataset Structure
### Data Fields
- dialogue: text of dialogue.
- summary: human written summary of the dialogue.
- topic: human written topic/one liner of the dialogue.
- id: unique file id of an example.
### Data Splits
- train: 12460
- val: 500
- test: 1500
- holdout: 100 [Only 3 features: id, dialogue, topic]
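To make the schema concrete, here is a minimal, self-contained Python sketch. The sample record is invented for illustration, and the `dialogue` → `text` / `summary` → `target` renaming follows the `train-eval-index` column mapping declared in this card's metadata:

```python
# Hypothetical sample record mirroring the dataset schema
# (id, dialogue, summary, topic); the values are invented.
sample = {
    "id": "train_0",
    "dialogue": "#Person1#: Привет! Как дела?\n#Person2#: Хорошо, спасибо.",
    "summary": "#Person1# и #Person2# обмениваются приветствиями.",
    "topic": "приветствие",
}

# Rename columns as declared in the card's train-eval-index
# (dialogue -> text, summary -> target) for summarization evaluation.
col_mapping = {"dialogue": "text", "summary": "target"}
eval_record = {col_mapping.get(k, k): v for k, v in sample.items()}

print(sorted(eval_record))  # ['id', 'target', 'text', 'topic']
```

In practice, the full dataset can be loaded with `datasets.load_dataset("d0rj/dialogsum-ru")`, which returns the `train`, `validation`, and `test` splits listed above.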
## Dataset Creation
### Curation Rationale
From the paper:
We collect dialogue data for DialogSum from three public dialogue corpora, namely DailyDialog (Li et al., 2017), DREAM (Sun et al., 2019) and MuTual (Cui et al., 2019), as well as an English speaking practice website. These datasets contain face-to-face spoken dialogues that cover a wide range of daily-life topics, including schooling, work, medication, shopping, leisure, and travel. Most conversations take place between friends, colleagues, and between service providers and customers.
Compared with previous datasets, dialogues from DialogSum have distinct characteristics:

- Under rich real-life scenarios, including more diverse task-oriented scenarios;
- Have clear communication patterns and intents, which is valuable to serve as summarization sources;
- Have a reasonable length, which comforts the purpose of automatic summarization.

We ask annotators to summarize each dialogue based on the following criteria:

- Convey the most salient information;
- Be brief;
- Preserve important named entities within the conversation;
- Be written from an observer perspective;
- Be written in formal language.
### Who are the source language producers?
Linguists.
### Who are the annotators?
Language experts.
## Licensing Information
MIT License
## Citation Information
```
@inproceedings{chen-etal-2021-dialogsum,
title = "{D}ialog{S}um: {A} Real-Life Scenario Dialogue Summarization Dataset",
author = "Chen, Yulong and
Liu, Yang and
Chen, Liang and
Zhang, Yue",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.449",
doi = "10.18653/v1/2021.findings-acl.449",
    pages = "5062--5074",
}
```
## Contributions
Thanks to [@cylnlp](https://github.com/cylnlp) for adding this dataset. |
open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b | ---
pretty_name: Evaluation run of migtissera/Tess-34B-v1.5b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-34B-v1.5b](https://huggingface.co/migtissera/Tess-34B-v1.5b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T23:12:19.798626](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b/blob/main/results_2024-01-28T23-12-19.798626.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7571140160311144,\n\
\ \"acc_stderr\": 0.028404294310283486,\n \"acc_norm\": 0.7619075094631577,\n\
\ \"acc_norm_stderr\": 0.028933953410883263,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5312154281103626,\n\
\ \"mc2_stderr\": 0.015485998460539758\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979277,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175449\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6555467038438558,\n\
\ \"acc_stderr\": 0.004742185169264772,\n \"acc_norm\": 0.8442541326428998,\n\
\ \"acc_norm_stderr\": 0.0036187316588377205\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108594,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108594\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.028919802956134905,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.028919802956134905\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.033917503223216586,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.033917503223216586\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838746,\n\
\ \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838746\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6640211640211641,\n \"acc_stderr\": 0.024326310529149145,\n \"\
acc_norm\": 0.6640211640211641,\n \"acc_norm_stderr\": 0.024326310529149145\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.9225806451612903,\n \"acc_stderr\": 0.015203644420774848,\n\
\ \"acc_norm\": 0.9225806451612903,\n \"acc_norm_stderr\": 0.015203644420774848\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6798029556650246,\n \"acc_stderr\": 0.032826493853041504,\n \"\
acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527034,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527034\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930883,\n\
\ \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930883\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \
\ \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707952,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707952\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571743,\n \"\
acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571743\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9240506329113924,\n \"acc_stderr\": 0.0172446332510657,\n \
\ \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.0172446332510657\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869622,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869622\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n\
\ \"acc_stderr\": 0.01039741708729285,\n \"acc_norm\": 0.9067688378033205,\n\
\ \"acc_norm_stderr\": 0.01039741708729285\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n\
\ \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7229050279329609,\n\
\ \"acc_stderr\": 0.01496877243581214,\n \"acc_norm\": 0.7229050279329609,\n\
\ \"acc_norm_stderr\": 0.01496877243581214\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.020645597910418763,\n\
\ \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.020645597910418763\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6134751773049646,\n \"acc_stderr\": 0.029049190342543465,\n \
\ \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.029049190342543465\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5958279009126467,\n\
\ \"acc_stderr\": 0.012533504046491367,\n \"acc_norm\": 0.5958279009126467,\n\
\ \"acc_norm_stderr\": 0.012533504046491367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549475,\n\
\ \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549475\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8251633986928104,\n \"acc_stderr\": 0.015366167064780655,\n \
\ \"acc_norm\": 0.8251633986928104,\n \"acc_norm_stderr\": 0.015366167064780655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585633,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585633\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5963855421686747,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.5963855421686747,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5312154281103626,\n\
\ \"mc2_stderr\": 0.015485998460539758\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \
\ \"acc_stderr\": 0.01330983907570649\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-34B-v1.5b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-19.798626.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- '**/details_harness|winogrande|5_2024-01-28T23-12-19.798626.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T23-12-19.798626.parquet'
- config_name: results
data_files:
- split: 2024_01_28T23_12_19.798626
path:
- results_2024-01-28T23-12-19.798626.parquet
- split: latest
path:
- results_2024-01-28T23-12-19.798626.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.5b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Tess-34B-v1.5b](https://huggingface.co/migtissera/Tess-34B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-28T23:12:19.798626](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b/blob/main/results_2024-01-28T23-12-19.798626.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own config, and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7571140160311144,
"acc_stderr": 0.028404294310283486,
"acc_norm": 0.7619075094631577,
"acc_norm_stderr": 0.028933953410883263,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5312154281103626,
"mc2_stderr": 0.015485998460539758
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979277,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175449
},
"harness|hellaswag|10": {
"acc": 0.6555467038438558,
"acc_stderr": 0.004742185169264772,
"acc_norm": 0.8442541326428998,
"acc_norm_stderr": 0.0036187316588377205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.025447863825108594,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.025447863825108594
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.028919802956134905,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.028919802956134905
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.033917503223216586,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.033917503223216586
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838746,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838746
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6640211640211641,
"acc_stderr": 0.024326310529149145,
"acc_norm": 0.6640211640211641,
"acc_norm_stderr": 0.024326310529149145
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9225806451612903,
"acc_stderr": 0.015203644420774848,
"acc_norm": 0.9225806451612903,
"acc_norm_stderr": 0.015203644420774848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6798029556650246,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.6798029556650246,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019949,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019949
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527034,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527034
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930883,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930883
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707952,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571743,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9240506329113924,
"acc_stderr": 0.0172446332510657,
"acc_norm": 0.9240506329113924,
"acc_norm_stderr": 0.0172446332510657
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869622,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869622
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.01039741708729285,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.01039741708729285
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.02090397584208303,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.02090397584208303
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7229050279329609,
"acc_stderr": 0.01496877243581214,
"acc_norm": 0.7229050279329609,
"acc_norm_stderr": 0.01496877243581214
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.020645597910418763,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.020645597910418763
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062072,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062072
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6134751773049646,
"acc_stderr": 0.029049190342543465,
"acc_norm": 0.6134751773049646,
"acc_norm_stderr": 0.029049190342543465
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5958279009126467,
"acc_stderr": 0.012533504046491367,
"acc_norm": 0.5958279009126467,
"acc_norm_stderr": 0.012533504046491367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.02257177102549475,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.02257177102549475
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8251633986928104,
"acc_stderr": 0.015366167064780655,
"acc_norm": 0.8251633986928104,
"acc_norm_stderr": 0.015366167064780655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585633,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585633
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5963855421686747,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.5963855421686747,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5312154281103626,
"mc2_stderr": 0.015485998460539758
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
},
"harness|gsm8k|5": {
"acc": 0.6285064442759667,
"acc_stderr": 0.01330983907570649
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-glue-mrpc-e15d1b-14665997 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: Intel/camembert-base-mrpc
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: Intel/camembert-base-mrpc
* Dataset: glue
* Config: mrpc
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/e11a2ce6 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1341
dataset_size: 188
---
# Dataset Card for "e11a2ce6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DianaJin/march | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 31704448
num_examples: 33
- name: test
num_bytes: 4803496
num_examples: 5
- name: valid
num_bytes: 3842480
num_examples: 4
download_size: 13908387
dataset_size: 40350424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
SEACrowd/imdb_jv | ---
license: unknown
tags:
- sentiment-analysis
language:
- ind
---
# imdb_jv
The Javanese IMDb Movie Reviews dataset is a Javanese version of the IMDb Movie Reviews dataset, created by translating the original English dataset into Javanese.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{wongso2021causal,
title={Causal and masked language modeling of Javanese language using transformer-based architectures},
author={Wongso, Wilson and Setiawan, David Samuel and Suhartono, Derwin},
booktitle={2021 International Conference on Advanced Computer Science and Information Systems (ICACSIS)},
pages={1--7},
year={2021},
organization={IEEE}
}
```
## License
Unknown
## Homepage
[https://huggingface.co/datasets/w11wo/imdb-javanese](https://huggingface.co/datasets/w11wo/imdb-javanese)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Nasimnewcode/Tree_species | ---
dataset_info:
features:
- name: image
dtype: string
- name: label
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 578082
num_examples: 3949
download_size: 0
dataset_size: 578082
---
# Dataset Card for "Tree_species"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FudanSELab/SO_KGXQR_DOCUMENT | ---
dataset_info:
- config_name: document_store_csharp
features:
- name: Id
dtype: int64
- name: Score
dtype: int64
- name: Title
dtype: string
- name: Tags
dtype: string
- name: Answer_score
dtype: int64
splits:
- name: test
num_bytes: 10032065
num_examples: 87030
download_size: 5446977
dataset_size: 10032065
- config_name: document_store_java
features:
- name: Id
dtype: int64
- name: Score
dtype: int64
- name: Title
dtype: string
- name: Tags
dtype: string
- name: Answer_score
dtype: int64
splits:
- name: test
num_bytes: 10015417
num_examples: 86531
download_size: 5476703
dataset_size: 10015417
- config_name: document_store_javascript
features:
- name: Id
dtype: int64
- name: Score
dtype: int64
- name: Title
dtype: string
- name: Tags
dtype: string
- name: Answer_score
dtype: int64
splits:
- name: test
num_bytes: 9368108
num_examples: 79091
download_size: 4701275
dataset_size: 9368108
- config_name: document_store_python
features:
- name: Id
dtype: int64
- name: Score
dtype: int64
- name: Title
dtype: string
- name: Tags
dtype: string
- name: Answer_score
dtype: int64
splits:
- name: test
num_bytes: 9326461
num_examples: 81072
download_size: 4929374
dataset_size: 9326461
configs:
- config_name: document_store_csharp
data_files:
- split: test
path: document_store_csharp/test-*
- config_name: document_store_java
data_files:
- split: test
path: document_store_java/test-*
- config_name: document_store_javascript
data_files:
- split: test
path: document_store_javascript/test-*
- config_name: document_store_python
data_files:
- split: test
path: document_store_python/test-*
license: mit
size_categories:
- 100K<n<1M
language:
- en
---
# Dataset Card for "SO_KGXQR_DOCUMENT"
## Dataset Description
- **Repository:** [GitHub Repository](https://kgxqr.github.io/)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
felipesampaio2010/uckermanndataset | ---
license: openrail
---
|
Salvatale/test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 20392
num_examples: 25
download_size: 19229
dataset_size: 20392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Werli/Shareg | ---
license: apache-2.0
---
|
knowledgator/biomed_NER | ---
license: apache-2.0
task_categories:
- token-classification
language:
- en
tags:
- biomed NER
- PubMed NER
- biology
- medicine
- NER
- entity extraction
pretty_name: biomed-ner
size_categories:
- 1K<n<10K
---
### BioMed_general_NER
This dataset consists of manually annotated biomedical abstracts from PubMed, drug descriptions from the FDA, and patent abstracts.
It covers 24 different entity types, including types specific to medicine and biology as well as general types such as location and organization.
With 4,840 annotated abstracts, it is one of the biggest datasets of its kind.
### Classes
Here's a description for each of the labels:
1. **CHEMICALS** - Represents substances with distinct molecular composition, often involved in various biological or industrial processes.
2. **CLINICAL DRUG** - Refers to pharmaceutical substances developed for medical use, aimed at preventing, treating, or managing diseases.
3. **BODY SUBSTANCE** - Denotes materials or substances within the human body, including fluids, tissues, and other biological components.
4. **ANATOMICAL STRUCTURE** - Describes specific parts or structures within an organism's body, often related to anatomy and physiology.
5. **CELLS AND THEIR COMPONENTS** - Encompasses the basic structural and functional units of living organisms, along with their constituent elements.
6. **GENE AND GENE PRODUCTS** - Involves genetic information and the resultant products, such as proteins, that play a crucial role in biological processes.
7. **INTELLECTUAL PROPERTY** - Pertains to legal rights associated with creations of the mind, including inventions, literary and artistic works, and trademarks.
8. **LANGUAGE** - Relates to linguistic elements, including words, phrases, and language constructs, often in the context of communication or analysis.
9. **REGULATION OR LAW** - Represents rules, guidelines, or legal frameworks established by authorities to govern behavior, practices, or procedures.
10. **GEOGRAPHICAL AREAS** - Refers to specific regions, locations, or places on the Earth's surface, often associated with particular characteristics or significance.
11. **ORGANISM** - Denotes a living being, typically a plant, animal, or microorganism, as a distinct biological entity.
12. **GROUP** - Encompasses collections of individuals with shared characteristics, interests, or affiliations.
13. **PERSON** - Represents an individual human being, often considered as a distinct entity with personal attributes.
14. **ORGANIZATION** - Refers to structured entities, institutions, or companies formed for specific purposes or activities.
15. **PRODUCT** - Encompasses tangible or intangible items resulting from a process, often associated with manufacturing or creation.
16. **LOCATION** - Describes a specific place or position, whether physical or abstract, with potential relevance to various contexts.
17. **PHENOTYPE** - Represents the observable characteristics or traits of an organism, resulting from the interaction of its genotype with the environment.
18. **DISORDER** - Denotes abnormal conditions or disruptions in the normal functioning of a biological organism, often associated with diseases or medical conditions.
19. **SIGNALING MOLECULES** - Involves molecules that transmit signals within and between cells, playing a crucial role in various physiological processes.
20. **EVENT** - Describes occurrences or happenings at a specific time and place, often with significance or impact.
21. **MEDICAL PROCEDURE** - Involves specific actions or interventions conducted for medical purposes, such as surgeries, diagnostic tests, or therapeutic treatments.
22. **ACTIVITY** - Encompasses actions, behaviors, or processes undertaken by individuals, groups, or entities.
23. **FUNCTION** - Describes the purpose or role of a biological or mechanical entity, focusing on its intended or inherent activities.
24. **MONEY** - Represents currency or financial assets used as a medium of exchange, often in the context of economic transactions.
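The card does not specify the annotation format, but for a token-classification dataset like this one, span-style annotations are commonly converted to BIO tags before training. A hypothetical sketch (the token/span representation and the toy example are illustrative, not taken from the dataset):

```python
def spans_to_bio(tokens, spans):
    """Convert character-level entity spans to BIO tags.

    tokens: list of (text, start, end) tuples.
    spans:  list of (start, end, label) tuples from span-style annotation.
    """
    tags = ["O"] * len(tokens)
    for s_start, s_end, label in spans:
        inside = False  # first token inside a span gets B-, the rest I-
        for i, (_, t_start, t_end) in enumerate(tokens):
            if t_start >= s_start and t_end <= s_end:
                tags[i] = ("I-" if inside else "B-") + label
                inside = True
    return tags

# Toy example: "aspirin reduces fever" with one CHEMICALS entity.
tokens = [("aspirin", 0, 7), ("reduces", 8, 15), ("fever", 16, 21)]
spans = [(0, 7, "CHEMICALS")]
print(spans_to_bio(tokens, spans))  # ['B-CHEMICALS', 'O', 'O']
```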
### Datasources
* PubMed - biomedical article abstracts;
* FDA - drug descriptions;
* Patents - patent abstracts.
|
dmayhem93/self-critiquing-base | ---
dataset_info:
features:
- name: id
dtype: string
- name: split
dtype: string
- name: time
dtype: float64
- name: labeler
dtype: string
- name: is_topic_based_summarization
dtype: bool
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 229932964
num_examples: 47017
- name: test
num_bytes: 73005699
num_examples: 10647
download_size: 55618766
dataset_size: 302938663
---
# Dataset Card for "self-critiquing-base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-0d414f0c-bce8-44f6-9c83-f356bfaf679d-1412 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
316usman/thematic5d_rr_embed | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 60575838
num_examples: 95157
download_size: 22323460
dataset_size: 60575838
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-tweet_eval-offensive-f58805-30720144956 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: cardiffnlp/twitter-roberta-base-2021-124m-offensive
metrics: ['bertscore']
dataset_name: tweet_eval
dataset_config: offensive
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: cardiffnlp/twitter-roberta-base-2021-124m-offensive
* Dataset: tweet_eval
* Config: offensive
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@fabeelaalirawther@gmail.com](https://huggingface.co/fabeelaalirawther@gmail.com) for evaluating this model. |
leeseungyeul/lawmean_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8453
num_examples: 135
download_size: 5700
dataset_size: 8453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
allenai/tulu-v1-sft-mixture | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1195802237
num_examples: 489818
download_size: 540343943
dataset_size: 1195802237
license: odc-by
task_categories:
- question-answering
- conversational
- text-generation
language:
- en
size_categories:
- 100K<n<1M
---
# Dataset Card for Tulu Instruction Mix
**For a newer version, see [Tulu V2](https://huggingface.co/datasets/allenai/tulu-v2)**
This version, the human data mixture, consists of a mix of:
* [FLAN](https://github.com/google-research/FLAN/tree/main) (Apache 2.0): FLAN v2 with CoT examples (most of the tasks in SuperNatural Instructions are included here)
* [Open Assistant 1](https://huggingface.co/datasets/OpenAssistant/oasst1) (Apache 2.0)
* [Dolly](https://huggingface.co/datasets/databricks/databricks-dolly-15k) (CC By SA 3.0)
* [ShareGPT](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered) (Apache 2.0 listed, no official repo found)
* [GPT4-Alpaca](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM#data-release) (CC By NC 4.0)
* [Code-Alpaca](https://github.com/sahil280114/codealpaca) (CC By NC 4.0)
These are made by taking either just the training split of each subset, or the entire subset if no splits are present.
For more information, see the paper [How Far Can Camels Go? Exploring the State of Instruction Tuning on Open Resources
](https://arxiv.org/abs/2306.04751).
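Each example stores a `messages` list of `{role, content}` turns. A minimal sketch of flattening such a list into a single training string (the role-marker tokens here are illustrative, not the ones used for Tulu training):

```python
def render_messages(messages):
    """Flatten a chat into one string with simple role markers."""
    parts = []
    for turn in messages:
        parts.append(f"<|{turn['role']}|>\n{turn['content']}")
    return "\n".join(parts)

example = [
    {"role": "user", "content": "What is instruction tuning?"},
    {"role": "assistant", "content": "Fine-tuning a model on instruction-response pairs."},
]
print(render_messages(example))
```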
### License
We are releasing this dataset under the terms of [ODC-BY](https://opendatacommons.org/licenses/by/1-0/). By using this, you are also bound by the [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/) in respect of the content contained in the dataset.
|
wantswanda/chinese | ---
task_categories:
- image-classification
language:
- en
pretty_name: chinese_characters
size_categories:
- 1K<n<10K
--- |
GEM-submissions/lewtun__this-is-a-test-name__1648048960 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
julia-lukasiewicz-pater/small-GPT-wiki-intro-features | ---
license: cc
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
---
# Small-GPT-wiki-intro-features dataset
This dataset is based on [aadityaubhat/GPT-wiki-intro](https://huggingface.co/datasets/aadityaubhat/GPT-wiki-intro).
It contains 100k randomly selected texts (50k from Wikipedia and 50k generated by ChatGPT).
For each text, various complexity measures were calculated, e.g. readability, lexical richness, etc.
It can be used for text classification or analysis of linguistic features of human-generated and ChatGPT-generated texts.
## Dataset structure
Features were calculated using various Python libraries, i.e. NLTK, [readability-metrics](https://pypi.org/project/py-readability-metrics/), [lexical-diversity](https://pypi.org/project/lexical-diversity/),
and [TextDescriptives](https://hlasse.github.io/TextDescriptives/). The list of all features and their corresponding sources can be found below:
| Column | Description |
| ------ | ----------- |
| text | human- or ChatGPT-generated text; taken from aadityaubhat/GPT-wiki-intro |
| normalized_bigram_entropy | bigram entropy normalized with estimated maximum entropy; nltk |
| mean_word_length | mean word length; nltk |
| mean_sent_length | mean sentence length; nltk |
| fog | Gunning-Fog; readability-metrics |
| ari | Automated Readability Index; readability-metrics |
| dale_chall | Dale Chall Readability; readability-metrics |
| hdd | Hypergeometric Distribution; lexical-diversity |
| mtld | Measure of lexical textual diversity; lexical-diversity |
| mattr | Moving average type-token ratio; lexical-diversity |
| number_of_ADJ | proportion of adjectives per word; nltk |
| number_of_ADP | proportion of adpositions per word; nltk |
| number_of_ADV | proportion of adverbs per word; nltk |
| number_of_CONJ | proportion of conjunctions per word; nltk |
| number_of_DET | proportion of determiners per word; nltk |
| number_of_NOUN | proportion of nouns per word; nltk |
| number_of_NUM | proportion of numerals per word; nltk |
| number_of_PRT | proportion of particles per word; nltk |
| number_of_PRON | proportion of pronouns per word; nltk |
| number_of_VERB | proportion of verbs per word; nltk |
| number_of_DOT | proportion of punctuation marks per word; nltk |
| number_of_X | proportion of POS tag 'Other' per word; nltk |
| class | binary class, 0 stands for Wikipedia, 1 stands for ChatGPT |
| spacy_perplexity | text perplexity; TextDescriptives |
| entropy | text entropy; TextDescriptives |
| automated_readability_index | Automated Readability Index; TextDescriptives |
| per_word_spacy_perplexity | text perplexity per word; TextDescriptives |
| dependency_distance_mean | mean distance from each token to their dependent; TextDescriptives |
| dependency_distance_std | standard deviation of distance from each token to their dependent; TextDescriptives |
| first_order_coherence | cosine similarity between consecutive sentences; TextDescriptives |
| second_order_coherence | cosine similarity between sentences that are two sentences apart; TextDescriptives |
| smog |SMOG; TextDescriptives |
| prop_adjacent_dependency_relation_mean | mean proportion adjacent dependency relations; TextDescriptives |
| prop_adjacent_dependency_relation_std | standard deviation of proportion adjacent dependency relations; TextDescriptives |
| syllables_per_token_mean | mean of syllables per token; TextDescriptives |
| syllables_per_token_median | median of syllables per token; TextDescriptives |
| token_length_std | standard deviation of token length; TextDescriptives |
| token_length_median | median of token length; TextDescriptives |
| sentence_length_median | median of sentence length; TextDescriptives |
| syllables_per_token_std | standard deviation of syllables per token; TextDescriptives |
| proportion_unique_tokens | proportion of unique tokens; TextDescriptives |
| top_ngram_chr_fraction_3 | fraction of characters in a document contained within the top 3-grams; TextDescriptives |
| top_ngram_chr_fraction_2 | fraction of characters in a document contained within the top 2-grams; TextDescriptives |
| top_ngram_chr_fraction_4 | fraction of characters in a document contained within the top 4-grams; TextDescriptives |
| proportion_bullet_points | proportion of lines in a document that start with a bullet point; TextDescriptives |
| flesch_reading_ease | Flesch Reading ease ; TextDescriptives |
| flesch_kincaid_grade | Flesch Kincaid grade; TextDescriptives |
| gunning_fog | Gunning-Fog; TextDescriptives |
| coleman_liau_index | Coleman-Liau Index; TextDescriptives |
| oov_ratio| out-of-vocabulary ratio; TextDescriptives |
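As an illustration of what a few of the simpler features measure, a minimal pure-Python sketch (the dataset itself was computed with NLTK and the libraries listed above, not with this code):

```python
def mean_word_length(text):
    """Average word length, with words split on whitespace."""
    words = text.split()
    return sum(len(w) for w in words) / len(words)

def proportion_unique_tokens(text):
    """Type-token ratio over lowercased whitespace tokens."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens)

sample = "the cat sat on the mat"
print(round(mean_word_length(sample), 2))         # 2.83
print(round(proportion_unique_tokens(sample), 2))  # 0.83
```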
## Code
Code that was used to generate this dataset can be found on [Github](https://github.com/julia-lukasiewicz-pater/gpt-wiki-features/tree/main).
|
autoevaluate/autoeval-staging-eval-project-samsum-07954c9f-11065483 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/led-large-book-summary
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-large-book-summary
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
Vageesh1/malicious_smart_contract_dataset_selected | ---
dataset_info:
features:
- name: creation_bytecode
dtype: string
- name: malicious
dtype: string
splits:
- name: train
num_bytes: 3113659829
num_examples: 139600
download_size: 663031998
dataset_size: 3113659829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "malicious_smart_contract_dataset_selected"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erenfazlioglu/turkishneuralvoice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 5933166725.824
num_examples: 130634
download_size: 5547933432
dataset_size: 5933166725.824
---
# Dataset Card for "turkishneuralvoice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AshrafAlAodat/sinograms | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 49082809.0
num_examples: 1400
download_size: 48978515
dataset_size: 49082809.0
---
# Dataset Card for "sinograms"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
timestap/fighter_jet_captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4591975.0
num_examples: 25
download_size: 4584088
dataset_size: 4591975.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fighter_jet_captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_invariant_tag_fronted_isnt | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 319
num_examples: 5
download_size: 2117
dataset_size: 319
---
# Dataset Card for "MULTI_VALUE_cola_invariant_tag_fronted_isnt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imvladikon/qqp_he | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: question1_he
dtype: string
- name: question2_he
dtype: string
- name: labse_score
dtype: float64
splits:
- name: train
num_bytes: 118297851
num_examples: 359985
- name: validation
num_bytes: 13144351
num_examples: 39998
- name: test
num_bytes: 109317000
num_examples: 329982
download_size: 147357764
dataset_size: 240759202
task_categories:
- sentence-similarity
language:
- he
- en
---
# Dataset Card for "qqp_he"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
The QQP corpus machine-translated into Hebrew (via Google Translate).
### Sample
```json
{'idx': 0,
'label': 0,
'labse_score': 0.536876916885376,
'question1': 'How is the life of a math student? Could you describe your own '
'experiences?',
'question1_he': 'איך החיים של תלמיד למתמטיקה? האם תוכל לתאר את החוויות שלך?',
'question2': 'Which level of prepration is enough for the exam jlpt5?',
'question2_he': 'איזו רמת הכנה מספיקה לבחינה jlpt n5?'}
``` |
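The `labse_score` field shown in the sample above measures translation similarity, so it can be used to keep only confidently translated pairs. A minimal sketch — the 0.7 threshold and the helper name are illustrative assumptions, not recommendations from the dataset card:

```python
# Sketch: filter translated pairs by LaBSE similarity.
# The 0.7 default threshold is an assumption, not part of the dataset.
def filter_by_labse(examples, threshold=0.7):
    """Keep only examples whose LaBSE score meets the threshold."""
    return [ex for ex in examples if ex["labse_score"] >= threshold]

sample = {"idx": 0, "label": 0, "labse_score": 0.536876916885376}
filter_by_labse([sample])       # -> [] (score is below 0.7)
filter_by_labse([sample], 0.5)  # -> [sample]
```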
mischel/Dataset_Ins_Test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 507941
num_examples: 1661
download_size: 139651
dataset_size: 507941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Dataset_Ins_Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrbesher/tr-paraphrase-tatoeba | ---
license: cc-by-4.0
---
|
roa7n/patched_test_p_20_f_SPOUT_v4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 517784141
num_examples: 1607399
download_size: 52108156
dataset_size: 517784141
---
# Dataset Card for "patched_test_p_20_f_SPOUT_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/futaba_sana_puellamagimadokamagicasidestorymagiarecord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Futaba Sana
This is the dataset of Futaba Sana, containing 152 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 152 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 348 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 152 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 152 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 152 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 152 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 152 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 348 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 348 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 348 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
luna-code/dspy | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 901376.0
num_examples: 249
download_size: 196638
dataset_size: 901376.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tr416/dataset_20231007_024754 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73962
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_024754"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akhileshav8/my_dataset_class | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Cavity
'1': Fillings
'2': Impacted Tooth
'3': Implant
splits:
- name: train
num_bytes: 33784205.219
num_examples: 2129
download_size: 33211118
dataset_size: 33784205.219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seanxh/twitter_dataset_1713014112 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 280263
num_examples: 784
download_size: 105494
dataset_size: 280263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_existential_possessives | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 185
num_examples: 3
- name: test
num_bytes: 159
num_examples: 3
- name: train
num_bytes: 1578
num_examples: 25
download_size: 6368
dataset_size: 1922
---
# Dataset Card for "MULTI_VALUE_cola_existential_possessives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jero98772/CuraPeces_Removed_background | ---
license: cc-by-4.0
---
# Dataset of Fish Diseases
### Description:
The Fish Disease Dataset is a collection of images depicting various diseases affecting fish. These images have been curated and labeled for the purpose of training and evaluating machine learning models for the detection and classification of fish diseases. The dataset covers a diverse range of fish species and diseases commonly observed in aquaculture and natural environments.
(The images were collected via manual web searches in 2017, from sites such as Facebook, Google, and DuckDuckGo.)
### Dataset Composition:
Images: The dataset contains high-resolution images of fish exhibiting symptoms of different diseases. These images capture various perspectives and conditions to ensure diversity and robustness in model training.
### Purpose:
Training a model to identify diseases of domestic fish.
### Potential Applications:
- Development of automated systems for early detection and diagnosis of fish diseases in aquaculture facilities.
- Research on the epidemiology and spread of various fish diseases in different geographical regions.
- Training and evaluation of machine learning models for real-time monitoring of fish health in natural habitats and aquaculture environments.
### Data Usage:
The Fish Disease Dataset is made freely available for non-commercial research and educational purposes. Users are encouraged to cite the source of the dataset in their publications and provide appropriate attribution to the contributors.
### Contributing:
Contributions to the Fish Disease Dataset are welcome and encouraged. If you have additional images or annotations that could enhance the quality and diversity of the dataset, please reach out to the maintainers for potential inclusion.
### Contact Information:
For inquiries, feedback, or collaboration opportunities related to the Fish Disease Dataset, please contact [curapeces@gmail.com].
### License:
The Fish Disease Dataset is released under the Creative Commons Attribution 4.0 International License (CC BY 4.0), as indicated in the dataset metadata. Please refer to the accompanying license file for detailed terms and conditions of use.
### Note: this README was drafted with the help of GPT. |
MikhailT/voxpopuli-en | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 2388645494.4157987
num_examples: 11871
- name: test
num_bytes: 265606271.8076703
num_examples: 1320
download_size: 1938036247
dataset_size: 2654251766.223469
---
# Dataset Card for "voxpopuli-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NickyNicky/aya_dataset_multilingual_chatml_gemma_response_json | ---
dataset_info:
features:
- name: text
dtype: string
- name: len_tokens
dtype: int64
splits:
- name: train
num_bytes: 44143916
num_examples: 48016
download_size: 8526144
dataset_size: 44143916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Tokenizer: google/gemma-2b-it
# hist len_tokens

```
<bos><start_of_turn>system
You are a helpful AI assistant.
solo responde en formato json.
lista de codigos linguisticos disponibles: ["es", "en", "fr", "de"].<end_of_turn>
<start_of_turn>user
{ "input": "es",
"targets": "fr",
"inputs_es": "¿Qué presidente de los Estados Unidos nunca se ha casado?"
}<end_of_turn>
<start_of_turn>model
{ "targets": "fr",
"targets_fr": "James Buchanan est le seul président qui ne s'est jamais marié."
}<end_of_turn><eos>
```
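The `<start_of_turn>`/`<end_of_turn>` layout above can be reproduced with a small formatter. A hedged sketch — the helper name and payload are illustrative, not part of the dataset's tooling:

```python
# Sketch of the Gemma chat-markup layout shown above.
# gemma_turn is an illustrative helper, not an official API.
def gemma_turn(role, content):
    return f"<start_of_turn>{role}\n{content}<end_of_turn>\n"

text = (
    "<bos>"
    + gemma_turn("system", "You are a helpful AI assistant.")
    + gemma_turn("user", '{ "input": "es", "targets": "fr" }')
)
print(text)
```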
# describe.

# percentile.

|
whyoke/segmentation_drone | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 469141459.0
num_examples: 350
- name: annotation
num_bytes: 53547177.0
num_examples: 40
download_size: 522729573
dataset_size: 522688636.0
---
# Dataset Card for "segmentation_drone"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approach0/mathy-phase2 | ---
dataset_info:
features:
- name: problem
dtype: string
- name: query
dtype: string
- name: prompt
dtype: string
- name: solution
dtype: string
- name: ground_truth
dtype: 'null'
- name: judge_buffer
dtype: 'null'
- name: manual_query
dtype: 'null'
- name: manual_rating
dtype: int64
- name: args
dtype: string
splits:
- name: train
num_bytes: 470590.71186440677
num_examples: 114
- name: test
num_bytes: 260063.28813559323
num_examples: 63
download_size: 0
dataset_size: 730654.0
---
# Dataset Card for "mathy-phase2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/loc_maps | |
voidful/DRCD | ---
license: cc-by-3.0
---
|
TempoFunk/tempofunk-sdance | ---
task_categories:
- text-to-video
- text-to-image
- video-classification
- image-classification
language:
- en
size_categories:
- 1K<n<10K
license: agpl-3.0
---
# TempoFunk S(mall)Dance
10k samples of metadata and encoded latents & prompts of videos themed around **dance**.
## Data format
- Video frame latents
- Numpy arrays
- 120 frames, 512x512 source size
- Encoded shape (120, 4, 64, 64)
- CLIP (openai) encoded prompts
- Video description (as seen in metadata)
- Encoded shape (77,768)
- Video metadata as JSON (description, tags, categories, source URLs, etc.) |
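A back-of-envelope sketch of the per-sample storage implied by the shapes above (assuming float32 storage, which is typical for SD-style latents but not stated explicitly on the card):

```python
# Per-video latent: 120 frames x 4 channels x 64 x 64 (shapes from the card).
frames, channels, h, w = 120, 4, 64, 64
latent_floats = frames * channels * h * w   # 1,966,080 values per video
latent_bytes = latent_floats * 4            # assuming float32 storage
prompt_bytes = 77 * 768 * 4                 # CLIP text embedding, float32
print(latent_bytes / 2**20)  # 7.5  (MiB per video)
print(prompt_bytes / 2**10)  # 231.0 (KiB per prompt)
```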
andreabac3/StackOverflow-Italian-Fauno-Baize | ---
license: gpl-3.0
---
# StackOverflow-Italian-Fauno-Baize
This dataset is an Italian translation of the StackOverflow dataset presented by Baize's authors.
## Dataset Description
- **Paper:** https://arxiv.org/abs/2304.01196
### Languages
Italian
## Dataset Structure
### Data Instances
- Sentences: 57,046
- Average number of turns: 3.6
- Average response length per turn: 36.0
### Data Fields
topic, input
### Data Splits
Train
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
https://github.com/project-baize/baize-chatbot
## Additional Information
### Dataset Curators
[Andrea Bacciu](https://andreabac3.github.io/), Dr. [Giovanni Trappolini](https://sites.google.com/view/giovannitrappolini), [Andrea Santilli](https://www.santilli.xyz/), and Professor [Fabrizio Silvestri](https://sites.google.com/diag.uniroma1.it/fabriziosilvestri/home).
### Licensing Information
This project is a derivative of Baize, and we adhere to the licensing constraints imposed by Baize's creators.
### Citation Information
```bibtex
@misc{fauno,
  author = {Andrea Bacciu and Giovanni Trappolini and Andrea Santilli and Fabrizio Silvestri},
title = {Fauno: The Italian Large Language Model that will leave you senza parole!},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/andreabac3/Fauno-Italian-LLM}},
}
```
```bibtex
@article{xu2023baize,
title={Baize: An Open-Source Chat Model with Parameter-Efficient Tuning on Self-Chat Data},
author={Xu, Canwen and Guo, Daya and Duan, Nan and McAuley, Julian},
journal={arXiv preprint arXiv:2304.01196},
year={2023}
}
``` |
basvojunagasai/test_data_set_basvoj | ---
license: unknown
---
|
open-llm-leaderboard/details_voidful__phi-1_5_chat | ---
pretty_name: Evaluation run of voidful/phi-1_5_chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [voidful/phi-1_5_chat](https://huggingface.co/voidful/phi-1_5_chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_voidful__phi-1_5_chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T21:35:53.785866](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__phi-1_5_chat/blob/main/results_2024-04-15T21-35-53.785866.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4090420452199382,\n\
\ \"acc_stderr\": 0.03452005539277286,\n \"acc_norm\": 0.4101124129738508,\n\
\ \"acc_norm_stderr\": 0.03528015676774361,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.43943840992859684,\n\
\ \"mc2_stderr\": 0.015111114848764144\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035005,\n\
\ \"acc_norm\": 0.4991467576791809,\n \"acc_norm_stderr\": 0.014611369529813276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4631547500497909,\n\
\ \"acc_stderr\": 0.004976214989483505,\n \"acc_norm\": 0.610336586337383,\n\
\ \"acc_norm_stderr\": 0.00486677237302994\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.03047144586718324,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.03047144586718324\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n\
\ \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112126,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112126\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4129032258064516,\n \"acc_stderr\": 0.028009138125400387,\n \"\
acc_norm\": 0.4129032258064516,\n \"acc_norm_stderr\": 0.028009138125400387\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\
\ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.025141801511177495,\n\
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.025141801511177495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5541284403669725,\n \"acc_stderr\": 0.021311335009708575,\n \"\
acc_norm\": 0.5541284403669725,\n \"acc_norm_stderr\": 0.021311335009708575\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.459915611814346,\n \"acc_stderr\": 0.03244246810187913,\n \
\ \"acc_norm\": 0.459915611814346,\n \"acc_norm_stderr\": 0.03244246810187913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4484304932735426,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.4484304932735426,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.045629515481807666,\n \"\
acc_norm\": 0.512396694214876,\n \"acc_norm_stderr\": 0.045629515481807666\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836183,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836183\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.0449394906861354,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.0449394906861354\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6153846153846154,\n\
\ \"acc_stderr\": 0.03187195347942466,\n \"acc_norm\": 0.6153846153846154,\n\
\ \"acc_norm_stderr\": 0.03187195347942466\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.4367816091954023,\n \"acc_stderr\": 0.017736470837800684,\n\
\ \"acc_norm\": 0.4367816091954023,\n \"acc_norm_stderr\": 0.017736470837800684\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4595375722543353,\n\
\ \"acc_stderr\": 0.026830805998952233,\n \"acc_norm\": 0.4595375722543353,\n\
\ \"acc_norm_stderr\": 0.026830805998952233\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859926,\n\
\ \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859926\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.45751633986928103,\n\
\ \"acc_stderr\": 0.028526383452142635,\n \"acc_norm\": 0.45751633986928103,\n\
\ \"acc_norm_stderr\": 0.028526383452142635\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.40514469453376206,\n \"acc_stderr\": 0.02788238379132595,\n\
\ \"acc_norm\": 0.40514469453376206,\n \"acc_norm_stderr\": 0.02788238379132595\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3950617283950617,\n\
\ \"acc_stderr\": 0.027201117666925657,\n \"acc_norm\": 0.3950617283950617,\n\
\ \"acc_norm_stderr\": 0.027201117666925657\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880582,\n\
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880582\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31029986962190353,\n\
\ \"acc_stderr\": 0.011815439293469836,\n \"acc_norm\": 0.31029986962190353,\n\
\ \"acc_norm_stderr\": 0.011815439293469836\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.028739328513983576,\n\
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.028739328513983576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3562091503267974,\n \"acc_stderr\": 0.019373332420724504,\n \
\ \"acc_norm\": 0.3562091503267974,\n \"acc_norm_stderr\": 0.019373332420724504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n\
\ \"acc_stderr\": 0.03525675167467974,\n \"acc_norm\": 0.5373134328358209,\n\
\ \"acc_norm_stderr\": 0.03525675167467974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.39766081871345027,\n \"acc_stderr\": 0.0375363895576169,\n\
\ \"acc_norm\": 0.39766081871345027,\n \"acc_norm_stderr\": 0.0375363895576169\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.43943840992859684,\n\
\ \"mc2_stderr\": 0.015111114848764144\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.012789321118542604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21455648218347234,\n \
\ \"acc_stderr\": 0.011307604104052882\n }\n}\n```"
repo_url: https://huggingface.co/voidful/phi-1_5_chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|arc:challenge|25_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|gsm8k|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hellaswag|10_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-10-56.646250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-35-53.785866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-35-53.785866.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- '**/details_harness|winogrande|5_2024-04-02T17-10-56.646250.parquet'
- split: 2024_04_15T21_35_53.785866
path:
- '**/details_harness|winogrande|5_2024-04-15T21-35-53.785866.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T21-35-53.785866.parquet'
- config_name: results
data_files:
- split: 2024_04_02T17_10_56.646250
path:
- results_2024-04-02T17-10-56.646250.parquet
- split: 2024_04_15T21_35_53.785866
path:
- results_2024-04-15T21-35-53.785866.parquet
- split: latest
path:
- results_2024-04-15T21-35-53.785866.parquet
---
# Dataset Card for Evaluation run of voidful/phi-1_5_chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [voidful/phi-1_5_chat](https://huggingface.co/voidful/phi-1_5_chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_voidful__phi-1_5_chat",
"harness_winogrande_5",
	split="latest")
```
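The timestamped splits listed in the metadata above are derived from each run's ISO timestamp. As a small illustrative sketch (the helper name below is ours, not part of the `datasets` library), the mapping simply replaces the characters that are not allowed in split names:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run's ISO timestamp (e.g. '2024-04-15T21:35:53.785866')
    to the split name used in this dataset's configurations
    (e.g. '2024_04_15T21_35_53.785866')."""
    return ts.replace("-", "_").replace(":", "_")

split_name = run_timestamp_to_split("2024-04-15T21:35:53.785866")
# You could then pass split=split_name (or split="latest") to load_dataset
# to select that specific run instead of the most recent one.
```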
## Latest results
These are the [latest results from run 2024-04-15T21:35:53.785866](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__phi-1_5_chat/blob/main/results_2024-04-15T21-35-53.785866.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4090420452199382,
"acc_stderr": 0.03452005539277286,
"acc_norm": 0.4101124129738508,
"acc_norm_stderr": 0.03528015676774361,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.43943840992859684,
"mc2_stderr": 0.015111114848764144
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.014599131353035005,
"acc_norm": 0.4991467576791809,
"acc_norm_stderr": 0.014611369529813276
},
"harness|hellaswag|10": {
"acc": 0.4631547500497909,
"acc_stderr": 0.004976214989483505,
"acc_norm": 0.610336586337383,
"acc_norm_stderr": 0.00486677237302994
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.03047144586718324,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.03047144586718324
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112126,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112126
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4129032258064516,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.4129032258064516,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.03604513672442202,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.03604513672442202
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.025141801511177495,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.025141801511177495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5541284403669725,
"acc_stderr": 0.021311335009708575,
"acc_norm": 0.5541284403669725,
"acc_norm_stderr": 0.021311335009708575
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.459915611814346,
"acc_stderr": 0.03244246810187913,
"acc_norm": 0.459915611814346,
"acc_norm_stderr": 0.03244246810187913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4484304932735426,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.4484304932735426,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.045629515481807666,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.045629515481807666
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.03901591825836183,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.03901591825836183
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.0449394906861354,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.0449394906861354
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.03187195347942466,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.03187195347942466
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4367816091954023,
"acc_stderr": 0.017736470837800684,
"acc_norm": 0.4367816091954023,
"acc_norm_stderr": 0.017736470837800684
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4595375722543353,
"acc_stderr": 0.026830805998952233,
"acc_norm": 0.4595375722543353,
"acc_norm_stderr": 0.026830805998952233
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.028526383452142635,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.028526383452142635
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40514469453376206,
"acc_stderr": 0.02788238379132595,
"acc_norm": 0.40514469453376206,
"acc_norm_stderr": 0.02788238379132595
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3950617283950617,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.3950617283950617,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880582,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880582
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31029986962190353,
"acc_stderr": 0.011815439293469836,
"acc_norm": 0.31029986962190353,
"acc_norm_stderr": 0.011815439293469836
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3562091503267974,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.3562091503267974,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5373134328358209,
"acc_stderr": 0.03525675167467974,
"acc_norm": 0.5373134328358209,
"acc_norm_stderr": 0.03525675167467974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.39766081871345027,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.39766081871345027,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.43943840992859684,
"mc2_stderr": 0.015111114848764144
},
"harness|winogrande|5": {
"acc": 0.7071823204419889,
"acc_stderr": 0.012789321118542604
},
"harness|gsm8k|5": {
"acc": 0.21455648218347234,
"acc_stderr": 0.011307604104052882
}
}
```
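For a quick aggregate over such a results dictionary, a minimal sketch might look like the following (it uses a hand-copied subset of the numbers above; in practice you would load the full dictionary from the `results` configuration or the linked JSON file):

```python
# Hand-copied subset of the "latest results" above; the real dictionary
# comes from the results_*.json file or the "results" configuration.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.4991467576791809},
    "harness|hellaswag|10": {"acc_norm": 0.610336586337383},
    "harness|winogrande|5": {"acc": 0.7071823204419889},
}

def mean_metric(task_results: dict, keys: tuple = ("acc_norm", "acc")) -> float:
    """Average the first available metric among `keys` across tasks."""
    values = []
    for metrics in task_results.values():
        for key in keys:
            if key in metrics:
                values.append(metrics[key])
                break
    return sum(values) / len(values)

average = mean_metric(results)  # mean of the three task scores above
```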
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

open-llm-leaderboard/details_rwitz2__ipo-test

---
pretty_name: Evaluation run of rwitz2/ipo-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rwitz2/ipo-test](https://huggingface.co/rwitz2/ipo-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rwitz2__ipo-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T03:53:21.138621](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz2__ipo-test/blob/main/results_2023-12-12T03-53-21.138621.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543450273126857,\n\
\ \"acc_stderr\": 0.03191864171781636,\n \"acc_norm\": 0.6545137141283983,\n\
\ \"acc_norm_stderr\": 0.03257628315307556,\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.558695592929387,\n\
\ \"mc2_stderr\": 0.015276769304708891\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175456,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946533\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6694881497709619,\n\
\ \"acc_stderr\": 0.004694360968929403,\n \"acc_norm\": 0.8598884684325832,\n\
\ \"acc_norm_stderr\": 0.003463933286063885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290895,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290895\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.01651367603117959,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.01651367603117959\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.558695592929387,\n\
\ \"mc2_stderr\": 0.015276769304708891\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7202426080363912,\n \
\ \"acc_stderr\": 0.012364384016735319\n }\n}\n```"
repo_url: https://huggingface.co/rwitz2/ipo-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|arc:challenge|25_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|gsm8k|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hellaswag|10_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T03-53-21.138621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T03-53-21.138621.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- '**/details_harness|winogrande|5_2023-12-12T03-53-21.138621.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T03-53-21.138621.parquet'
- config_name: results
data_files:
- split: 2023_12_12T03_53_21.138621
path:
- results_2023-12-12T03-53-21.138621.parquet
- split: latest
path:
- results_2023-12-12T03-53-21.138621.parquet
---
# Dataset Card for Evaluation run of rwitz2/ipo-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rwitz2/ipo-test](https://huggingface.co/rwitz2/ipo-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rwitz2__ipo-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-12T03:53:21.138621](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz2__ipo-test/blob/main/results_2023-12-12T03-53-21.138621.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543450273126857,
"acc_stderr": 0.03191864171781636,
"acc_norm": 0.6545137141283983,
"acc_norm_stderr": 0.03257628315307556,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.558695592929387,
"mc2_stderr": 0.015276769304708891
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175456,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946533
},
"harness|hellaswag|10": {
"acc": 0.6694881497709619,
"acc_stderr": 0.004694360968929403,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290895,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290895
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.01651367603117959,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.01651367603117959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.558695592929387,
"mc2_stderr": 0.015276769304708891
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510427
},
"harness|gsm8k|5": {
"acc": 0.7202426080363912,
"acc_stderr": 0.012364384016735319
}
}
```
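Once loaded, the per-task entries above can be aggregated locally. A minimal sketch (the two entries below are copied from the JSON above as stand-ins; a real run would load the full `results` configuration first):

```python
# Average per-task MMLU accuracies from a results dict keyed by harness task name.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8304093567251462},
}

# MMLU subtasks are the keys starting with "harness|hendrycksTest".
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest")
]
average = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {average:.4f}")
```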
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kardosdrur/dfm_domain_classification | ---
dataset_info:
features:
- name: content
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 40202954.4
num_examples: 80000
- name: test
num_bytes: 10050738.6
num_examples: 20000
download_size: 33465068
dataset_size: 50253693.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_irrealis_be_done | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 131965
num_examples: 492
- name: dev_mismatched
num_bytes: 113947
num_examples: 425
- name: test_matched
num_bytes: 113038
num_examples: 448
- name: test_mismatched
num_bytes: 97833
num_examples: 390
- name: train
num_bytes: 4942695
num_examples: 18765
download_size: 3211083
dataset_size: 5399478
---
# Dataset Card for "MULTI_VALUE_mnli_irrealis_be_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4503721.11061151
num_examples: 2875
- name: validation
num_bytes: 409972
num_examples: 300
download_size: 1373318
dataset_size: 4913693.11061151
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juancopi81/test-sam-1 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: conditioning_image
dtype: image
- name: overlaid
dtype: image
- name: caption
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 2748434.0
num_examples: 5
download_size: 2753855
dataset_size: 2748434.0
---
# Dataset Card for "test-sam-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Amit19july/simpleODdataset | ---
license: other
---
|
open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16 | ---
pretty_name: Evaluation run of fblgit/una-cybertron-7b-v1-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fblgit/una-cybertron-7b-v1-fp16](https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:23:37.533105](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16/blob/main/results_2023-12-04T16-23-37.533105.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6356711503628629,\n\
\ \"acc_stderr\": 0.03264369072727708,\n \"acc_norm\": 0.6379873148773121,\n\
\ \"acc_norm_stderr\": 0.03330588124087063,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.632786784829325,\n\
\ \"mc2_stderr\": 0.015062396850296454\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.01395241369960094,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6598287193786099,\n\
\ \"acc_stderr\": 0.0047279834341954945,\n \"acc_norm\": 0.8542123083051185,\n\
\ \"acc_norm_stderr\": 0.0035217202839105555\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601457,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570776,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570776\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881876,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881876\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537365,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223974,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223974\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.632786784829325,\n\
\ \"mc2_stderr\": 0.015062396850296454\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5511751326762699,\n \
\ \"acc_stderr\": 0.013700157442788071\n }\n}\n```"
repo_url: https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-23-37.533105.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- '**/details_harness|winogrande|5_2023-12-04T16-23-37.533105.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-23-37.533105.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_23_37.533105
path:
- results_2023-12-04T16-23-37.533105.parquet
- split: latest
path:
- results_2023-12-04T16-23-37.533105.parquet
---
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v1-fp16](https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T16:23:37.533105](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16/blob/main/results_2023-12-04T16-23-37.533105.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6356711503628629,
"acc_stderr": 0.03264369072727708,
"acc_norm": 0.6379873148773121,
"acc_norm_stderr": 0.03330588124087063,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.632786784829325,
"mc2_stderr": 0.015062396850296454
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.01395241369960094,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.6598287193786099,
"acc_stderr": 0.0047279834341954945,
"acc_norm": 0.8542123083051185,
"acc_norm_stderr": 0.0035217202839105555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570776,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881876,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881876
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206244,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223974,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.632786784829325,
"mc2_stderr": 0.015062396850296454
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.01094187795567621
},
"harness|gsm8k|5": {
"acc": 0.5511751326762699,
"acc_stderr": 0.013700157442788071
}
}
```
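The per-task accuracies above can also be aggregated client-side. A minimal sketch (not part of the evaluation harness; the dict below reuses a few values from the results JSON purely for illustration):

```python
# Sketch: compute a macro-average accuracy over the MMLU (hendrycksTest)
# tasks from a results dict such as the JSON shown above. In practice you
# would load the full dict from the linked results JSON file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Keep only the hendrycksTest entries and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
macro_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU macro-average over {len(mmlu)} tasks: {macro_avg:.4f}")
```

Note that the leaderboard's own aggregation may weight tasks differently; this is an unweighted mean over the task-level `acc` values.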
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sproos/twitter-pairclass-sw | ---
dataset_info:
features:
- name: sent1
sequence: string
- name: sent2
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 10795702
num_examples: 1
download_size: 4444037
dataset_size: 10795702
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter-pairclass-sw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alobaidizt/faq-sample-embeddings | ---
license: mit
---
|
narySt/CommitChronicle_valPretrained | ---
dataset_info:
features:
- name: message
dtype: string
- name: model_input
dtype: string
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1432270229
num_examples: 109505
download_size: 81607985
dataset_size: 1432270229
---
# Dataset Card for "CommitChronicle_valPretrained"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pythainlp/final_training_set_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: metadata
struct:
- name: source
dtype: string
- name: nb_token
dtype: int64
splits:
- name: train
num_bytes: 337155434.9768474
num_examples: 405760
- name: test
num_bytes: 1277960.0231525812
num_examples: 1538
download_size: 191404581
dataset_size: 338433395
task_categories:
- conversational
- text-generation
language:
- en
---
# Dataset Card for "final_training_set_v1"
Finetuning datasets for [WangChanGLM](https://github.com/pythainlp/wangchanglm) sourced from [LAION OIG chip2 and infill_dbpedia](https://huggingface.co/datasets/laion/OIG) ([Apache-2.0](https://github.com/pythainlp/wangchanglm/blob/main/LICENSE)), [DataBricks Dolly v2](https://github.com/databrickslabs/dolly) ([Apache-2.0](https://github.com/pythainlp/wangchanglm/blob/main/LICENSE)), [OpenAI TL;DR](https://github.com/openai/summarize-from-feedback) ([MIT](https://opensource.org/license/mit/)), and [Hello-SimpleAI HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3) ([CC-BY SA](https://creativecommons.org/licenses/by-sa/4.0/)) |
chnwentao/RAG_data | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1712948009 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8399
num_examples: 20
download_size: 8954
dataset_size: 8399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712948009"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
visual-layer/vl-imagenet-1k | ---
license: other
dataset_info:
features:
- name: image
dtype:
image:
decode: false
- name: label
dtype:
class_label:
names:
0: tench, Tinca tinca
1: goldfish, Carassius auratus
2: great white shark, white shark, man-eater, man-eating shark, Carcharodon
carcharias
3: tiger shark, Galeocerdo cuvieri
4: hammerhead, hammerhead shark
5: electric ray, crampfish, numbfish, torpedo
6: stingray
7: cock
8: hen
9: ostrich, Struthio camelus
10: brambling, Fringilla montifringilla
11: goldfinch, Carduelis carduelis
12: house finch, linnet, Carpodacus mexicanus
13: junco, snowbird
14: indigo bunting, indigo finch, indigo bird, Passerina cyanea
15: robin, American robin, Turdus migratorius
16: bulbul
17: jay
18: magpie
19: chickadee
20: water ouzel, dipper
21: kite
22: bald eagle, American eagle, Haliaeetus leucocephalus
23: vulture
24: great grey owl, great gray owl, Strix nebulosa
25: European fire salamander, Salamandra salamandra
26: common newt, Triturus vulgaris
27: eft
28: spotted salamander, Ambystoma maculatum
29: axolotl, mud puppy, Ambystoma mexicanum
30: bullfrog, Rana catesbeiana
31: tree frog, tree-frog
32: tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui
33: loggerhead, loggerhead turtle, Caretta caretta
34: leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea
35: mud turtle
36: terrapin
37: box turtle, box tortoise
38: banded gecko
39: common iguana, iguana, Iguana iguana
40: American chameleon, anole, Anolis carolinensis
41: whiptail, whiptail lizard
42: agama
43: frilled lizard, Chlamydosaurus kingi
44: alligator lizard
45: Gila monster, Heloderma suspectum
46: green lizard, Lacerta viridis
47: African chameleon, Chamaeleo chamaeleon
48: Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis
49: African crocodile, Nile crocodile, Crocodylus niloticus
50: American alligator, Alligator mississipiensis
51: triceratops
52: thunder snake, worm snake, Carphophis amoenus
53: ringneck snake, ring-necked snake, ring snake
54: hognose snake, puff adder, sand viper
55: green snake, grass snake
56: king snake, kingsnake
57: garter snake, grass snake
58: water snake
59: vine snake
60: night snake, Hypsiglena torquata
61: boa constrictor, Constrictor constrictor
62: rock python, rock snake, Python sebae
63: Indian cobra, Naja naja
64: green mamba
65: sea snake
66: horned viper, cerastes, sand viper, horned asp, Cerastes cornutus
67: diamondback, diamondback rattlesnake, Crotalus adamanteus
68: sidewinder, horned rattlesnake, Crotalus cerastes
69: trilobite
70: harvestman, daddy longlegs, Phalangium opilio
71: scorpion
72: black and gold garden spider, Argiope aurantia
73: barn spider, Araneus cavaticus
74: garden spider, Aranea diademata
75: black widow, Latrodectus mactans
76: tarantula
77: wolf spider, hunting spider
78: tick
79: centipede
80: black grouse
81: ptarmigan
82: ruffed grouse, partridge, Bonasa umbellus
83: prairie chicken, prairie grouse, prairie fowl
84: peacock
85: quail
86: partridge
87: African grey, African gray, Psittacus erithacus
88: macaw
89: sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita
90: lorikeet
91: coucal
92: bee eater
93: hornbill
94: hummingbird
95: jacamar
96: toucan
97: drake
98: red-breasted merganser, Mergus serrator
99: goose
100: black swan, Cygnus atratus
101: tusker
102: echidna, spiny anteater, anteater
103: platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus
anatinus
104: wallaby, brush kangaroo
105: koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus
106: wombat
107: jellyfish
108: sea anemone, anemone
109: brain coral
110: flatworm, platyhelminth
111: nematode, nematode worm, roundworm
112: conch
113: snail
114: slug
115: sea slug, nudibranch
116: chiton, coat-of-mail shell, sea cradle, polyplacophore
117: chambered nautilus, pearly nautilus, nautilus
118: Dungeness crab, Cancer magister
119: rock crab, Cancer irroratus
120: fiddler crab
121: king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes
camtschatica
122: American lobster, Northern lobster, Maine lobster, Homarus americanus
123: spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish
124: crayfish, crawfish, crawdad, crawdaddy
125: hermit crab
126: isopod
127: white stork, Ciconia ciconia
128: black stork, Ciconia nigra
129: spoonbill
130: flamingo
131: little blue heron, Egretta caerulea
132: American egret, great white heron, Egretta albus
133: bittern
134: crane
135: limpkin, Aramus pictus
136: European gallinule, Porphyrio porphyrio
137: American coot, marsh hen, mud hen, water hen, Fulica americana
138: bustard
139: ruddy turnstone, Arenaria interpres
140: red-backed sandpiper, dunlin, Erolia alpina
141: redshank, Tringa totanus
142: dowitcher
143: oystercatcher, oyster catcher
144: pelican
145: king penguin, Aptenodytes patagonica
146: albatross, mollymawk
147: grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius
robustus
148: killer whale, killer, orca, grampus, sea wolf, Orcinus orca
149: dugong, Dugong dugon
150: sea lion
151: Chihuahua
152: Japanese spaniel
153: Maltese dog, Maltese terrier, Maltese
154: Pekinese, Pekingese, Peke
155: Shih-Tzu
156: Blenheim spaniel
157: papillon
158: toy terrier
159: Rhodesian ridgeback
160: Afghan hound, Afghan
161: basset, basset hound
162: beagle
163: bloodhound, sleuthhound
164: bluetick
165: black-and-tan coonhound
166: Walker hound, Walker foxhound
167: English foxhound
168: redbone
169: borzoi, Russian wolfhound
170: Irish wolfhound
171: Italian greyhound
172: whippet
173: Ibizan hound, Ibizan Podenco
174: Norwegian elkhound, elkhound
175: otterhound, otter hound
176: Saluki, gazelle hound
177: Scottish deerhound, deerhound
178: Weimaraner
179: Staffordshire bullterrier, Staffordshire bull terrier
180: American Staffordshire terrier, Staffordshire terrier, American pit
bull terrier, pit bull terrier
181: Bedlington terrier
182: Border terrier
183: Kerry blue terrier
184: Irish terrier
185: Norfolk terrier
186: Norwich terrier
187: Yorkshire terrier
188: wire-haired fox terrier
189: Lakeland terrier
190: Sealyham terrier, Sealyham
191: Airedale, Airedale terrier
192: cairn, cairn terrier
193: Australian terrier
194: Dandie Dinmont, Dandie Dinmont terrier
195: Boston bull, Boston terrier
196: miniature schnauzer
197: giant schnauzer
198: standard schnauzer
199: Scotch terrier, Scottish terrier, Scottie
200: Tibetan terrier, chrysanthemum dog
201: silky terrier, Sydney silky
202: soft-coated wheaten terrier
203: West Highland white terrier
204: Lhasa, Lhasa apso
205: flat-coated retriever
206: curly-coated retriever
207: golden retriever
208: Labrador retriever
209: Chesapeake Bay retriever
210: German short-haired pointer
211: vizsla, Hungarian pointer
212: English setter
213: Irish setter, red setter
214: Gordon setter
215: Brittany spaniel
216: clumber, clumber spaniel
217: English springer, English springer spaniel
218: Welsh springer spaniel
219: cocker spaniel, English cocker spaniel, cocker
220: Sussex spaniel
221: Irish water spaniel
222: kuvasz
223: schipperke
224: groenendael
225: malinois
226: briard
227: kelpie
228: komondor
229: Old English sheepdog, bobtail
230: Shetland sheepdog, Shetland sheep dog, Shetland
231: collie
232: Border collie
233: Bouvier des Flandres, Bouviers des Flandres
234: Rottweiler
235: German shepherd, German shepherd dog, German police dog, alsatian
236: Doberman, Doberman pinscher
237: miniature pinscher
238: Greater Swiss Mountain dog
239: Bernese mountain dog
240: Appenzeller
241: EntleBucher
242: boxer
243: bull mastiff
244: Tibetan mastiff
245: French bulldog
246: Great Dane
247: Saint Bernard, St Bernard
248: Eskimo dog, husky
249: malamute, malemute, Alaskan malamute
250: Siberian husky
251: dalmatian, coach dog, carriage dog
252: affenpinscher, monkey pinscher, monkey dog
253: basenji
254: pug, pug-dog
255: Leonberg
256: Newfoundland, Newfoundland dog
257: Great Pyrenees
258: Samoyed, Samoyede
259: Pomeranian
260: chow, chow chow
261: keeshond
262: Brabancon griffon
263: Pembroke, Pembroke Welsh corgi
264: Cardigan, Cardigan Welsh corgi
265: toy poodle
266: miniature poodle
267: standard poodle
268: Mexican hairless
269: timber wolf, grey wolf, gray wolf, Canis lupus
270: white wolf, Arctic wolf, Canis lupus tundrarum
271: red wolf, maned wolf, Canis rufus, Canis niger
272: coyote, prairie wolf, brush wolf, Canis latrans
273: dingo, warrigal, warragal, Canis dingo
274: dhole, Cuon alpinus
275: African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus
276: hyena, hyaena
277: red fox, Vulpes vulpes
278: kit fox, Vulpes macrotis
279: Arctic fox, white fox, Alopex lagopus
280: grey fox, gray fox, Urocyon cinereoargenteus
281: tabby, tabby cat
282: tiger cat
283: Persian cat
284: Siamese cat, Siamese
285: Egyptian cat
286: cougar, puma, catamount, mountain lion, painter, panther, Felis concolor
287: lynx, catamount
288: leopard, Panthera pardus
289: snow leopard, ounce, Panthera uncia
290: jaguar, panther, Panthera onca, Felis onca
291: lion, king of beasts, Panthera leo
292: tiger, Panthera tigris
293: cheetah, chetah, Acinonyx jubatus
294: brown bear, bruin, Ursus arctos
295: American black bear, black bear, Ursus americanus, Euarctos americanus
296: ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus
297: sloth bear, Melursus ursinus, Ursus ursinus
298: mongoose
299: meerkat, mierkat
300: tiger beetle
301: ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle
302: ground beetle, carabid beetle
303: long-horned beetle, longicorn, longicorn beetle
304: leaf beetle, chrysomelid
305: dung beetle
306: rhinoceros beetle
307: weevil
308: fly
309: bee
310: ant, emmet, pismire
311: grasshopper, hopper
312: cricket
313: walking stick, walkingstick, stick insect
314: cockroach, roach
315: mantis, mantid
316: cicada, cicala
317: leafhopper
318: lacewing, lacewing fly
319: dragonfly, darning needle, devil's darning needle, sewing needle, snake
feeder, snake doctor, mosquito hawk, skeeter hawk
320: damselfly
321: admiral
322: ringlet, ringlet butterfly
323: monarch, monarch butterfly, milkweed butterfly, Danaus plexippus
324: cabbage butterfly
325: sulphur butterfly, sulfur butterfly
326: lycaenid, lycaenid butterfly
327: starfish, sea star
328: sea urchin
329: sea cucumber, holothurian
330: wood rabbit, cottontail, cottontail rabbit
331: hare
332: Angora, Angora rabbit
333: hamster
334: porcupine, hedgehog
335: fox squirrel, eastern fox squirrel, Sciurus niger
336: marmot
337: beaver
338: guinea pig, Cavia cobaya
339: sorrel
340: zebra
341: hog, pig, grunter, squealer, Sus scrofa
342: wild boar, boar, Sus scrofa
343: warthog
344: hippopotamus, hippo, river horse, Hippopotamus amphibius
345: ox
346: water buffalo, water ox, Asiatic buffalo, Bubalus bubalis
347: bison
348: ram, tup
349: bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain
sheep, Ovis canadensis
350: ibex, Capra ibex
351: hartebeest
352: impala, Aepyceros melampus
353: gazelle
354: Arabian camel, dromedary, Camelus dromedarius
355: llama
356: weasel
357: mink
358: polecat, fitch, foulmart, foumart, Mustela putorius
359: black-footed ferret, ferret, Mustela nigripes
360: otter
361: skunk, polecat, wood pussy
362: badger
363: armadillo
364: three-toed sloth, ai, Bradypus tridactylus
365: orangutan, orang, orangutang, Pongo pygmaeus
366: gorilla, Gorilla gorilla
367: chimpanzee, chimp, Pan troglodytes
368: gibbon, Hylobates lar
369: siamang, Hylobates syndactylus, Symphalangus syndactylus
370: guenon, guenon monkey
371: patas, hussar monkey, Erythrocebus patas
372: baboon
373: macaque
374: langur
375: colobus, colobus monkey
376: proboscis monkey, Nasalis larvatus
377: marmoset
378: capuchin, ringtail, Cebus capucinus
379: howler monkey, howler
380: titi, titi monkey
381: spider monkey, Ateles geoffroyi
382: squirrel monkey, Saimiri sciureus
383: Madagascar cat, ring-tailed lemur, Lemur catta
384: indri, indris, Indri indri, Indri brevicaudatus
385: Indian elephant, Elephas maximus
386: African elephant, Loxodonta africana
387: lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens
388: giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca
389: barracouta, snoek
390: eel
391: coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch
392: rock beauty, Holocanthus tricolor
393: anemone fish
394: sturgeon
395: gar, garfish, garpike, billfish, Lepisosteus osseus
396: lionfish
397: puffer, pufferfish, blowfish, globefish
398: abacus
399: abaya
400: academic gown, academic robe, judge's robe
401: accordion, piano accordion, squeeze box
402: acoustic guitar
403: aircraft carrier, carrier, flattop, attack aircraft carrier
404: airliner
405: airship, dirigible
406: altar
407: ambulance
408: amphibian, amphibious vehicle
409: analog clock
410: apiary, bee house
411: apron
412: ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin,
dustbin, trash barrel, trash bin
413: assault rifle, assault gun
414: backpack, back pack, knapsack, packsack, rucksack, haversack
415: bakery, bakeshop, bakehouse
416: balance beam, beam
417: balloon
418: ballpoint, ballpoint pen, ballpen, Biro
419: Band Aid
420: banjo
421: bannister, banister, balustrade, balusters, handrail
422: barbell
423: barber chair
424: barbershop
425: barn
426: barometer
427: barrel, cask
428: barrow, garden cart, lawn cart, wheelbarrow
429: baseball
430: basketball
431: bassinet
432: bassoon
433: bathing cap, swimming cap
434: bath towel
435: bathtub, bathing tub, bath, tub
436: beach wagon, station wagon, wagon, estate car, beach waggon, station
waggon, waggon
437: beacon, lighthouse, beacon light, pharos
438: beaker
439: bearskin, busby, shako
440: beer bottle
441: beer glass
442: bell cote, bell cot
443: bib
444: bicycle-built-for-two, tandem bicycle, tandem
445: bikini, two-piece
446: binder, ring-binder
447: binoculars, field glasses, opera glasses
448: birdhouse
449: boathouse
450: bobsled, bobsleigh, bob
451: bolo tie, bolo, bola tie, bola
452: bonnet, poke bonnet
453: bookcase
454: bookshop, bookstore, bookstall
455: bottlecap
456: bow
457: bow tie, bow-tie, bowtie
458: brass, memorial tablet, plaque
459: brassiere, bra, bandeau
460: breakwater, groin, groyne, mole, bulwark, seawall, jetty
461: breastplate, aegis, egis
462: broom
463: bucket, pail
464: buckle
465: bulletproof vest
466: bullet train, bullet
467: butcher shop, meat market
468: cab, hack, taxi, taxicab
469: caldron, cauldron
470: candle, taper, wax light
471: cannon
472: canoe
473: can opener, tin opener
474: cardigan
475: car mirror
476: carousel, carrousel, merry-go-round, roundabout, whirligig
477: carpenter's kit, tool kit
478: carton
479: car wheel
480: cash machine, cash dispenser, automated teller machine, automatic teller
machine, automated teller, automatic teller, ATM
481: cassette
482: cassette player
483: castle
484: catamaran
485: CD player
486: cello, violoncello
487: cellular telephone, cellular phone, cellphone, cell, mobile phone
488: chain
489: chainlink fence
490: chain mail, ring mail, mail, chain armor, chain armour, ring armor,
ring armour
491: chain saw, chainsaw
492: chest
493: chiffonier, commode
494: chime, bell, gong
495: china cabinet, china closet
496: Christmas stocking
497: church, church building
498: cinema, movie theater, movie theatre, movie house, picture palace
499: cleaver, meat cleaver, chopper
500: cliff dwelling
501: cloak
502: clog, geta, patten, sabot
503: cocktail shaker
504: coffee mug
505: coffeepot
506: coil, spiral, volute, whorl, helix
507: combination lock
508: computer keyboard, keypad
509: confectionery, confectionary, candy store
510: container ship, containership, container vessel
511: convertible
512: corkscrew, bottle screw
513: cornet, horn, trumpet, trump
514: cowboy boot
515: cowboy hat, ten-gallon hat
516: cradle
517: crane2
518: crash helmet
519: crate
520: crib, cot
521: Crock Pot
522: croquet ball
523: crutch
524: cuirass
525: dam, dike, dyke
526: desk
527: desktop computer
528: dial telephone, dial phone
529: diaper, nappy, napkin
530: digital clock
531: digital watch
532: dining table, board
533: dishrag, dishcloth
534: dishwasher, dish washer, dishwashing machine
535: disk brake, disc brake
536: dock, dockage, docking facility
537: dogsled, dog sled, dog sleigh
538: dome
539: doormat, welcome mat
540: drilling platform, offshore rig
541: drum, membranophone, tympan
542: drumstick
543: dumbbell
544: Dutch oven
545: electric fan, blower
546: electric guitar
547: electric locomotive
548: entertainment center
549: envelope
550: espresso maker
551: face powder
552: feather boa, boa
553: file, file cabinet, filing cabinet
554: fireboat
555: fire engine, fire truck
556: fire screen, fireguard
557: flagpole, flagstaff
558: flute, transverse flute
559: folding chair
560: football helmet
561: forklift
562: fountain
563: fountain pen
564: four-poster
565: freight car
566: French horn, horn
567: frying pan, frypan, skillet
568: fur coat
569: garbage truck, dustcart
570: gasmask, respirator, gas helmet
571: gas pump, gasoline pump, petrol pump, island dispenser
572: goblet
573: go-kart
574: golf ball
575: golfcart, golf cart
576: gondola
577: gong, tam-tam
578: gown
579: grand piano, grand
580: greenhouse, nursery, glasshouse
581: grille, radiator grille
582: grocery store, grocery, food market, market
583: guillotine
584: hair slide
585: hair spray
586: half track
587: hammer
588: hamper
589: hand blower, blow dryer, blow drier, hair dryer, hair drier
590: hand-held computer, hand-held microcomputer
591: handkerchief, hankie, hanky, hankey
592: hard disc, hard disk, fixed disk
593: harmonica, mouth organ, harp, mouth harp
594: harp
595: harvester, reaper
596: hatchet
597: holster
598: home theater, home theatre
599: honeycomb
600: hook, claw
601: hoopskirt, crinoline
602: horizontal bar, high bar
603: horse cart, horse-cart
604: hourglass
605: iPod
606: iron, smoothing iron
607: jack-o'-lantern
608: jean, blue jean, denim
609: jeep, landrover
610: jersey, T-shirt, tee shirt
611: jigsaw puzzle
612: jinrikisha, ricksha, rickshaw
613: joystick
614: kimono
615: knee pad
616: knot
617: lab coat, laboratory coat
618: ladle
619: lampshade, lamp shade
620: laptop, laptop computer
621: lawn mower, mower
622: lens cap, lens cover
623: letter opener, paper knife, paperknife
624: library
625: lifeboat
626: lighter, light, igniter, ignitor
627: limousine, limo
628: liner, ocean liner
629: lipstick, lip rouge
630: Loafer
631: lotion
632: loudspeaker, speaker, speaker unit, loudspeaker system, speaker system
633: loupe, jeweler's loupe
634: lumbermill, sawmill
635: magnetic compass
636: mailbag, postbag
637: mailbox, letter box
638: maillot
639: maillot, tank suit
640: manhole cover
641: maraca
642: marimba, xylophone
643: mask
644: matchstick
645: maypole
646: maze, labyrinth
647: measuring cup
648: medicine chest, medicine cabinet
649: megalith, megalithic structure
650: microphone, mike
651: microwave, microwave oven
652: military uniform
653: milk can
654: minibus
655: miniskirt, mini
656: minivan
657: missile
658: mitten
659: mixing bowl
660: mobile home, manufactured home
661: Model T
662: modem
663: monastery
664: monitor
665: moped
666: mortar
667: mortarboard
668: mosque
669: mosquito net
670: motor scooter, scooter
671: mountain bike, all-terrain bike, off-roader
672: mountain tent
673: mouse, computer mouse
674: mousetrap
675: moving van
676: muzzle
677: nail
678: neck brace
679: necklace
680: nipple
681: notebook, notebook computer
682: obelisk
683: oboe, hautboy, hautbois
684: ocarina, sweet potato
685: odometer, hodometer, mileometer, milometer
686: oil filter
687: organ, pipe organ
688: oscilloscope, scope, cathode-ray oscilloscope, CRO
689: overskirt
690: oxcart
691: oxygen mask
692: packet
693: paddle, boat paddle
694: paddlewheel, paddle wheel
695: padlock
696: paintbrush
697: pajama, pyjama, pj's, jammies
698: palace
699: panpipe, pandean pipe, syrinx
700: paper towel
701: parachute, chute
702: parallel bars, bars
703: park bench
704: parking meter
705: passenger car, coach, carriage
706: patio, terrace
707: pay-phone, pay-station
708: pedestal, plinth, footstall
709: pencil box, pencil case
710: pencil sharpener
711: perfume, essence
712: Petri dish
713: photocopier
714: pick, plectrum, plectron
715: pickelhaube
716: picket fence, paling
717: pickup, pickup truck
718: pier
719: piggy bank, penny bank
720: pill bottle
721: pillow
722: ping-pong ball
723: pinwheel
724: pirate, pirate ship
725: pitcher, ewer
726: plane, carpenter's plane, woodworking plane
727: planetarium
728: plastic bag
729: plate rack
730: plow, plough
731: plunger, plumber's helper
732: Polaroid camera, Polaroid Land camera
733: pole
734: police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria
735: poncho
736: pool table, billiard table, snooker table
737: pop bottle, soda bottle
738: pot, flowerpot
739: potter's wheel
740: power drill
741: prayer rug, prayer mat
742: printer
743: prison, prison house
744: projectile, missile
745: projector
746: puck, hockey puck
747: punching bag, punch bag, punching ball, punchball
748: purse
749: quill, quill pen
750: quilt, comforter, comfort, puff
751: racer, race car, racing car
752: racket, racquet
753: radiator
754: radio, wireless
755: radio telescope, radio reflector
756: rain barrel
757: recreational vehicle, RV, R.V.
758: reel
759: reflex camera
760: refrigerator, icebox
761: remote control, remote
762: restaurant, eating house, eating place, eatery
763: revolver, six-gun, six-shooter
764: rifle
765: rocking chair, rocker
766: rotisserie
767: rubber eraser, rubber, pencil eraser
768: rugby ball
769: rule, ruler
770: running shoe
771: safe
772: safety pin
773: saltshaker, salt shaker
774: sandal
775: sarong
776: sax, saxophone
777: scabbard
778: scale, weighing machine
779: school bus
780: schooner
781: scoreboard
782: screen, CRT screen
783: screw
784: screwdriver
785: seat belt, seatbelt
786: sewing machine
787: shield, buckler
788: shoe shop, shoe-shop, shoe store
789: shoji
790: shopping basket
791: shopping cart
792: shovel
793: shower cap
794: shower curtain
795: ski
796: ski mask
797: sleeping bag
798: slide rule, slipstick
799: sliding door
800: slot, one-armed bandit
801: snorkel
802: snowmobile
803: snowplow, snowplough
804: soap dispenser
805: soccer ball
806: sock
807: solar dish, solar collector, solar furnace
808: sombrero
809: soup bowl
810: space bar
811: space heater
812: space shuttle
813: spatula
814: speedboat
815: spider web, spider's web
816: spindle
817: sports car, sport car
818: spotlight, spot
819: stage
820: steam locomotive
821: steel arch bridge
822: steel drum
823: stethoscope
824: stole
825: stone wall
826: stopwatch, stop watch
827: stove
828: strainer
829: streetcar, tram, tramcar, trolley, trolley car
830: stretcher
831: studio couch, day bed
832: stupa, tope
833: submarine, pigboat, sub, U-boat
834: suit, suit of clothes
835: sundial
836: sunglass
837: sunglasses, dark glasses, shades
838: sunscreen, sunblock, sun blocker
839: suspension bridge
840: swab, swob, mop
841: sweatshirt
842: swimming trunks, bathing trunks
843: swing
844: switch, electric switch, electrical switch
845: syringe
846: table lamp
847: tank, army tank, armored combat vehicle, armoured combat vehicle
848: tape player
849: teapot
850: teddy, teddy bear
851: television, television system
852: tennis ball
853: thatch, thatched roof
854: theater curtain, theatre curtain
855: thimble
856: thresher, thrasher, threshing machine
857: throne
858: tile roof
859: toaster
860: tobacco shop, tobacconist shop, tobacconist
861: toilet seat
862: torch
863: totem pole
864: tow truck, tow car, wrecker
865: toyshop
866: tractor
867: trailer truck, tractor trailer, trucking rig, rig, articulated lorry,
semi
868: tray
869: trench coat
870: tricycle, trike, velocipede
871: trimaran
872: tripod
873: triumphal arch
874: trolleybus, trolley coach, trackless trolley
875: trombone
876: tub, vat
877: turnstile
878: typewriter keyboard
879: umbrella
880: unicycle, monocycle
881: upright, upright piano
882: vacuum, vacuum cleaner
883: vase
884: vault
885: velvet
886: vending machine
887: vestment
888: viaduct
889: violin, fiddle
890: volleyball
891: waffle iron
892: wall clock
893: wallet, billfold, notecase, pocketbook
894: wardrobe, closet, press
895: warplane, military plane
896: washbasin, handbasin, washbowl, lavabo, wash-hand basin
897: washer, automatic washer, washing machine
898: water bottle
899: water jug
900: water tower
901: whiskey jug
902: whistle
903: wig
904: window screen
905: window shade
906: Windsor tie
907: wine bottle
908: wing
909: wok
910: wooden spoon
911: wool, woolen, woollen
912: worm fence, snake fence, snake-rail fence, Virginia fence
913: wreck
914: yawl
915: yurt
916: web site, website, internet site, site
917: comic book
918: crossword puzzle, crossword
919: street sign
920: traffic light, traffic signal, stoplight
921: book jacket, dust cover, dust jacket, dust wrapper
922: menu
923: plate
924: guacamole
925: consomme
926: hot pot, hotpot
927: trifle
928: ice cream, icecream
929: ice lolly, lolly, lollipop, popsicle
930: French loaf
931: bagel, beigel
932: pretzel
933: cheeseburger
934: hotdog, hot dog, red hot
935: mashed potato
936: head cabbage
937: broccoli
938: cauliflower
939: zucchini, courgette
940: spaghetti squash
941: acorn squash
942: butternut squash
943: cucumber, cuke
944: artichoke, globe artichoke
945: bell pepper
946: cardoon
947: mushroom
948: Granny Smith
949: strawberry
950: orange
951: lemon
952: fig
953: pineapple, ananas
954: banana
955: jackfruit, jak, jack
956: custard apple
957: pomegranate
958: hay
959: carbonara
960: chocolate sauce, chocolate syrup
961: dough
962: meat loaf, meatloaf
963: pizza, pizza pie
964: potpie
965: burrito
966: red wine
967: espresso
968: cup
969: eggnog
970: alp
971: bubble
972: cliff, drop, drop-off
973: coral reef
974: geyser
975: lakeside, lakeshore
976: promontory, headland, head, foreland
977: sandbar, sand bar
978: seashore, coast, seacoast, sea-coast
979: valley, vale
980: volcano
981: ballplayer, baseball player
982: groom, bridegroom
983: scuba diver
984: rapeseed
985: daisy
986: yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus,
Cypripedium parviflorum
987: corn
988: acorn
989: hip, rose hip, rosehip
990: buckeye, horse chestnut, conker
991: coral fungus
992: agaric
993: gyromitra
994: stinkhorn, carrion fungus
995: earthstar
996: hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa
997: bolete
998: ear, spike, capitulum
999: toilet tissue, toilet paper, bathroom tissue
splits:
- name: train
num_bytes: 153448487293.13168
num_examples: 1265871
- name: validation
num_bytes: 14250116434.46592
num_examples: 98598
download_size: 13250925336
dataset_size: 167698603727.5976
---
# Description
The `vl-imagenet-1k` is a sanitized version of the original ImageNet-1K dataset.
The following are issues found in the original dataset and removed in this dataset:
<table>
  <thead>
    <tr>
      <th align="left">Category</th>
      <th align="left">Percentage</th>
      <th align="left">Count</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td align="left">Duplicates</td>
      <td align="left">0.57%</td>
      <td align="left">7,522</td>
    </tr>
    <tr>
      <td align="left">Outliers</td>
      <td align="left">0.09%</td>
      <td align="left">1,199</td>
    </tr>
    <tr>
      <td align="left">Blur</td>
      <td align="left">0.19%</td>
      <td align="left">2,478</td>
    </tr>
    <tr>
      <td align="left">Dark</td>
      <td align="left">0.24%</td>
      <td align="left">3,174</td>
    </tr>
    <tr>
      <td align="left">Bright</td>
      <td align="left">0.06%</td>
      <td align="left">770</td>
    </tr>
    <tr>
      <td align="left">Mislabels</td>
      <td align="left">0.11%</td>
      <td align="left">1,480</td>
    </tr>
    <tr>
      <td align="left">Leakage</td>
      <td align="left">0.065%</td>
      <td align="left">869</td>
    </tr>
    <tr>
      <td align="left"><strong>Total</strong></td>
      <td align="left"><strong>1.313%</strong></td>
      <td align="left"><strong>17,492</strong></td>
    </tr>
  </tbody>
</table>
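As a quick sanity check, the per-category counts in the table sum to the reported total. A minimal sketch using only the numbers from the table above:

```python
# Per-category counts of removed images, copied from the table above.
removed = {
    "Duplicates": 7_522,
    "Outliers": 1_199,
    "Blur": 2_478,
    "Dark": 3_174,
    "Bright": 770,
    "Mislabels": 1_480,
    "Leakage": 869,
}

total_removed = sum(removed.values())
print(total_removed)  # 17492, matching the table's Total row
```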
Learn more - https://docs.visual-layer.com/docs/available-datasets#vl-imagenet-1k
# About Visual-Layer
<div align="center">
<a href="https://www.visual-layer.com">
    <img alt="Visual Layer Logo" src="https://github.com/visual-layer/visuallayer/blob/main/imgs/vl_horizontal_logo.png?raw=true" width="400">
</a>
</div>
Visual Layer is founded by the authors of [XGBoost](https://github.com/dmlc/xgboost), [Apache TVM](https://github.com/apache/tvm) & [Turi Create](https://github.com/apple/turicreate) - [Danny Bickson](https://www.linkedin.com/in/dr-danny-bickson-835b32), [Carlos Guestrin](https://www.linkedin.com/in/carlos-guestrin-5352a869) and [Amir Alush](https://www.linkedin.com/in/amiralush).
Learn more about Visual Layer [here](https://visual-layer.com). |
pminervini/hl-fever | ---
license: mit
dataset_info:
- config_name: default
features:
- name: id
dtype: int64
- name: label
dtype: string
- name: claim
dtype: string
splits:
- name: train
num_bytes: 4212783
num_examples: 59550
- name: dev
num_bytes: 959596
num_examples: 13332
download_size: 3105453
dataset_size: 5172379
- config_name: v1.0
features:
- name: id
dtype: int64
- name: label
dtype: string
- name: claim
dtype: string
splits:
- name: train
num_bytes: 4242558
num_examples: 59550
- name: dev
num_bytes: 966262
num_examples: 13332
download_size: 3105766
dataset_size: 5208820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- config_name: v1.0
data_files:
- split: train
path: v1.0/train-*
- split: dev
path: v1.0/dev-*
---
|
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-c76793-16626245 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: Adrian/distilbert-base-uncased-finetuned-squad-colab
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Adrian/distilbert-base-uncased-finetuned-squad-colab
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
SINAI/spanish-acronyms-pubmed | ---
language:
- es
pretty_name: Specialty-based sense inventory for Spanish clinical acronym resolution
tags:
- clinical
- medical
license: cc-by-nc-sa-4.0
---
# Leveraging PubMed to Create a Specialty-Based Sense Inventory for Spanish Acronym Resolution
Acronyms frequently occur in clinical text, which makes their identification, disambiguation and resolution an important task in clinical natural language processing. This paper contributes to acronym resolution in Spanish through the creation of a set of sense inventories organized by clinical specialty containing acronyms, their expansions, and corpus-driven features. The new acronym resource is composed of 51 clinical specialties with 3,603 acronyms in total, from which we identified 228 language-independent acronyms and 391 language-dependent expansions. We further analyzed the sense inventory across specialties and present novel insights into acronym usage in biomedical Spanish texts.
# Authors
- Alexandra Pomares-Quimbaya
- Pilar López-Úbeda
- Michel Oleynik
- Stefan Schulz
# Citing
If you use the lexicon in your research, please cite: [Leveraging PubMed to Create a Specialty-Based Sense Inventory for Spanish Acronym Resolution](https://ebooks.iospress.nl/volumearticle/54171).
```
@incollection{pomares2020leveraging,
title={Leveraging pubmed to create a specialty-based sense inventory for spanish acronym resolution},
author={Pomares-Quimbaya, Alexandra and L{\'o}pez-{\'U}beda, Pilar and Oleynik, Michel and Schulz, Stefan},
booktitle={Digital Personalized Health and Medicine},
pages={292--296},
year={2020},
publisher={IOS Press}
}
``` |
FelixChau/ArchiveFrench | ---
license: apache-2.0
---
|
kunwarsaaim/AntiBiasDataset | ---
license: mit
---
# Dataset from the paper [Debiasing Pre-Trained Language Models via Efficient Fine-Tuning](https://aclanthology.org/2022.ltedi-1.8/)
------------------------
The dataset is formed by combining two different datasets: [WinoBias](https://github.com/uclanlp/corefBias) and [CrowS-Pairs](https://github.com/nyu-mll/crows-pairs) |
jaozindacdd/chiquinho | ---
license: openrail
---
|
Mitsuki-Sakamoto/alpaca_farm-alpaca_gpt4_preference-re-preference_test | ---
dataset_info:
- config_name: opt-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 326835
num_examples: 194
download_size: 218330
dataset_size: 326835
- config_name: opt-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 521048
num_examples: 194
download_size: 311073
dataset_size: 521048
- config_name: pythia-1.3b-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 340641
num_examples: 194
download_size: 229970
dataset_size: 340641
- config_name: pythia-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
splits:
- name: preference
num_bytes: 333120
num_examples: 194
download_size: 213247
dataset_size: 333120
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 487413
num_examples: 194
download_size: 314679
dataset_size: 487413
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 537218
num_examples: 194
download_size: 319560
dataset_size: 537218
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow_w_peft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 517341
num_examples: 194
download_size: 320773
dataset_size: 517341
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_slow-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 363613
num_examples: 194
download_size: 229405
dataset_size: 363613
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_wo_peft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 381652
num_examples: 194
download_size: 241724
dataset_size: 381652
- config_name: pythia-1.4b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 447991
num_examples: 194
download_size: 271136
dataset_size: 447991
- config_name: pythia-1B-response-full-static-sft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 162223
num_examples: 194
download_size: 110142
dataset_size: 162223
- config_name: pythia-1B-static-sft-reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
- name: old_output_1
dtype: string
- name: old_output_2
dtype: string
splits:
- name: preference
num_bytes: 120611
num_examples: 194
download_size: 83257
dataset_size: 120611
- config_name: reward-model-deberta-v3-large-v2
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
splits:
- name: preference
num_bytes: 113541
num_examples: 194
download_size: 76166
dataset_size: 113541
configs:
- config_name: opt-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: opt-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2/preference-*
- config_name: opt-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: opt-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft_constant-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow_w_peft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft_constant_slow_w_peft-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_slow-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft_slow-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.3b_alpaca_farm_instructions_sft_wo_peft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.3b_alpaca_farm_instructions_sft_wo_peft-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1.4b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1.4b_alpaca_farm_instructions_sft-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1B-response-full-static-sft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1B-response-full-static-sft-reward-model-deberta-v3-large-v2/preference-*
- config_name: pythia-1B-static-sft-reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: pythia-1B-static-sft-reward-model-deberta-v3-large-v2/preference-*
- config_name: reward-model-deberta-v3-large-v2
data_files:
- split: preference
path: reward-model-deberta-v3-large-v2/preference-*
---
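
The configs above each pair a policy model with the reward model used to produce its `preference` split. A minimal sketch for loading one such config, assuming the Hugging Face `datasets` library; the repo id is a placeholder argument (substitute this dataset's actual Hub id), and the helper names are illustrative:

```python
def preference_config(policy_model: str, reward_model: str) -> str:
    """Build a config name following the '<policy>-<reward_model>' pattern above."""
    return f"{policy_model}-{reward_model}"


def load_preference_split(repo_id: str, policy_model: str, reward_model: str):
    """Load the 'preference' split for one (policy, reward model) pairing.

    repo_id is a placeholder: pass this dataset's actual Hub id.
    """
    from datasets import load_dataset  # requires the `datasets` package
    return load_dataset(
        repo_id,
        preference_config(policy_model, reward_model),
        split="preference",
    )


# Example config name taken from the listing above:
name = preference_config("pythia-1.3b", "reward-model-deberta-v3-large-v2")
# name == "pythia-1.3b-reward-model-deberta-v3-large-v2"
```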
|
open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1 | ---
pretty_name: Evaluation run of vihangd/smartsolmix-4x10.7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vihangd/smartsolmix-4x10.7b-v1](https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run\
\ can be found as a specific split in each configuration, the split being named\
\ using the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T08:52:41.511718](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1/blob/main/results_2024-01-05T08-52-41.511718.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6617320765919033,\n\
\ \"acc_stderr\": 0.031609550329954696,\n \"acc_norm\": 0.6640092521869942,\n\
\ \"acc_norm_stderr\": 0.032248539123169905,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5503302799032582,\n\
\ \"mc2_stderr\": 0.015375535036682436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946707,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.660426209918343,\n\
\ \"acc_stderr\": 0.004725967684806407,\n \"acc_norm\": 0.8513244373630751,\n\
\ \"acc_norm_stderr\": 0.003550412891647448\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777028,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777028\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822513,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822513\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595698,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595698\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.015839400406212494,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.015839400406212494\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008553,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008553\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49674054758800523,\n\
\ \"acc_stderr\": 0.012769964760343309,\n \"acc_norm\": 0.49674054758800523,\n\
\ \"acc_norm_stderr\": 0.012769964760343309\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887664,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887664\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904017,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904017\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5503302799032582,\n\
\ \"mc2_stderr\": 0.015375535036682436\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.01045089954537063\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \
\ \"acc_stderr\": 0.013524848894462115\n }\n}\n```"
repo_url: https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|arc:challenge|25_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|gsm8k|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hellaswag|10_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T08-52-41.511718.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- '**/details_harness|winogrande|5_2024-01-05T08-52-41.511718.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T08-52-41.511718.parquet'
- config_name: results
data_files:
- split: 2024_01_05T08_52_41.511718
path:
- results_2024-01-05T08-52-41.511718.parquet
- split: latest
path:
- results_2024-01-05T08-52-41.511718.parquet
---
# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vihangd/smartsolmix-4x10.7b-v1](https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1",
"harness_winogrande_5",
            split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T08:52:41.511718](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1/blob/main/results_2024-01-05T08-52-41.511718.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6617320765919033,
"acc_stderr": 0.031609550329954696,
"acc_norm": 0.6640092521869942,
"acc_norm_stderr": 0.032248539123169905,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5503302799032582,
"mc2_stderr": 0.015375535036682436
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.660426209918343,
"acc_stderr": 0.004725967684806407,
"acc_norm": 0.8513244373630751,
"acc_norm_stderr": 0.003550412891647448
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777028,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777028
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822513,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822513
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595698,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212494,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212494
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49674054758800523,
"acc_stderr": 0.012769964760343309,
"acc_norm": 0.49674054758800523,
"acc_norm_stderr": 0.012769964760343309
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887664,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887664
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904017,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904017
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5503302799032582,
"mc2_stderr": 0.015375535036682436
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.01045089954537063
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462115
}
}
```
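The aggregated metrics above are plain JSON and can be inspected without re-downloading the dataset. A minimal sketch (the dict literal and the `headline` helper are illustrative, with values copied from the snippet above):

```python
# Minimal sketch: pull the headline metrics out of the aggregated results above.
# The dict reproduces a subset of the "all" block shown in this card.
results = {
    "all": {
        "acc": 0.6617320765919033,
        "acc_norm": 0.6640092521869942,
        "mc2": 0.5503302799032582,
    },
    "harness|winogrande|5": {"acc": 0.8342541436464088},
}

def headline(res):
    """Return the averaged accuracy metrics from the 'all' entry, rounded."""
    return {k: round(v, 4) for k, v in res["all"].items()}

print(headline(results))  # {'acc': 0.6617, 'acc_norm': 0.664, 'mc2': 0.5503}
```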
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arieg/bw_spec_cls_4_17_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1644'
'1': '1649'
'2': '1661'
'3': '1663'
splits:
- name: train
num_bytes: 43937841.0
num_examples: 800
- name: test
num_bytes: 1084667.0
num_examples: 20
download_size: 39034892
dataset_size: 45022508.0
---
# Dataset Card for "bw_spec_cls_4_17_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Food101_test_google_flan_t5_small_mode_T_SPECIFIC_A_ns_25250 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 10964309
num_examples: 25250
download_size: 0
dataset_size: 10964309
---
# Dataset Card for "Food101_test_google_flan_t5_small_mode_T_SPECIFIC_A_ns_25250"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
learn3r/summ_screen_fd_bp | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 119519799
num_examples: 3673
- name: validation
num_bytes: 10838812
num_examples: 338
- name: test
num_bytes: 11004410
num_examples: 337
download_size: 6435842
dataset_size: 141363021
---
# Dataset Card for "summ_screen_fd_bp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anan-2024/twitter_dataset_1712997290 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 367364
num_examples: 984
download_size: 193066
dataset_size: 367364
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bathrobe/safe-statworx-haiku | ---
license: apache-2.0
---
|
lewtun/splits-test | ---
dataset_info:
features:
- name: a
dtype: int64
splits:
- name: foo
num_bytes: 24
num_examples: 3
- name: bar
num_bytes: 24
num_examples: 3
- name: baz
num_bytes: 24
num_examples: 3
download_size: 1737
dataset_size: 72
---
# Dataset Card for "splits-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |