datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
EnigmaOfTheWorld/wikisql-alpaca | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22123547
num_examples: 56355
download_size: 4653001
dataset_size: 22123547
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikisql-alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seongill/NQ_5_adversary_v2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: answer_sent
sequence: string
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: has_answer
dtype: bool
- name: random_sub
dtype: string
- name: similar_sub
dtype: string
- name: ent_type
dtype: string
- name: new_ctxs
list:
- name: answer_sent
sequence: string
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: is_adv
dtype: bool
- name: new_answer_sent
dtype: string
- name: original_text
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: num_advs
dtype: int64
- name: num_ctxs
dtype: int64
splits:
- name: train
num_bytes: 27255069
num_examples: 3610
download_size: 15221336
dataset_size: 27255069
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
olabode/ds_fc_chat-v2 | ---
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 96083
num_examples: 52
download_size: 22256
dataset_size: 96083
---
# Dataset Card for "ds_fc_chat-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ravithejads/ms_marco_hi_mr | ---
dataset_info:
features:
- name: answers
sequence: string
- name: passages
sequence:
- name: is_selected
dtype: int32
- name: passage_text
dtype: string
- name: url
dtype: string
- name: query
dtype: string
- name: query_id
dtype: int32
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: string
- name: query_hi
dtype: string
- name: answers_hi
dtype: string
- name: passage_text_hi
sequence: string
- name: query_mr
dtype: string
- name: passage_text_mr
sequence: string
- name: answers_mr
sequence: string
splits:
- name: test
num_bytes: 218320193
num_examples: 9650
download_size: 78984379
dataset_size: 218320193
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
pixta-ai/e-commerce-apparel-dataset-for-ai-ml | ---
license: other
---
# 1. Overview
This dataset is a collection of 5,000+ images of clothing & apparels set that are ready to use for optimizing the accuracy of computer vision models. All of the contents is sourced from PIXTA's stock library of 100M+ Asian-featured images and videos. PIXTA is the largest platform of visual materials in the Asia Pacific region offering fully-managed services, high quality contents and data, and powerful tools for businesses & organisations to enable their creative and machine learning projects.
# 2. Use case
The e-commerce apparel dataset could be used for various AI & Computer Vision models: Product Visual Search, Similar Product Recommendation, Product Catalog,... Each data set is supported by both AI and human review process to ensure labelling consistency and accuracy. Contact us for more custom datasets.
# 3. About PIXTA
PIXTASTOCK is the largest Asian-featured stock platform providing data, contents, tools and services since 2005. PIXTA experiences 15 years of integrating advanced AI technology in managing, curating, processing over 100M visual materials and serving global leading brands for their creative and data demands. Visit us at https://www.pixta.ai/ or contact via our email contact@pixta.ai." |
dim/ficbook_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: author
dtype: string
- name: title
dtype: string
- name: link
dtype: string
- name: description
dtype: string
- name: tag
dtype: string
- name: likes
dtype: string
- name: date
dtype: string
- name: review
dtype: string
- name: format
dtype: string
- name: text
dtype: string
- name: rating
dtype: string
- name: status
dtype: string
- name: parts
dtype: string
splits:
- name: train
num_bytes: 1046798039
num_examples: 114411
download_size: 539051486
dataset_size: 1046798039
---
# Dataset Card for "ficbook_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deadbits/vigil-instruction-bypass-all-mpnet-base-v2 | ---
tags:
- embeddings
- text
- security
pretty_name: 'Vigil: LLM Instruction Bypass all-mpnet-base-v2'
---
# Vigil: LLM Instruction Bypass all-mpnet-base-v2
- **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm)
`Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs.
This repository contains `all-mpnet-base-v2` embeddings for all Instruction Bypass style prompts ("Ignore instructions ...") used by [Vigil](https://github.com/deadbits/prompt-injection-defense).
You can use the [parquet2vdb.py](https://github.com/deadbits/prompt-injection-defense/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings in the Vigil chromadb instance, or use them in your own application.
## Format
```json
[
{
"text": str,
"embedding": [],
"model": "all-mpnet-base-v2"
}
]
```
Instruction bypass prompts generated with: https://gist.github.com/deadbits/e93a90aa36c9aa7b5ce1179597a6fe3d#file-generate-phrases-py |
uwunion/instruct_svg | ---
license: cc
dataset_info:
features:
- name: image
dtype: image
- name: input
dtype: string
- name: output
dtype: string
- name: description_0
dtype: string
- name: description_1
dtype: string
splits:
- name: train
num_bytes: 8627552.0
num_examples: 617
download_size: 7810230
dataset_size: 8627552.0
---
|
open-llm-leaderboard/details_zhengchenphd__ICE-GRT | ---
pretty_name: Evaluation run of zhengchenphd/ICE-GRT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zhengchenphd/ICE-GRT](https://huggingface.co/zhengchenphd/ICE-GRT) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zhengchenphd__ICE-GRT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T18:10:29.187016](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengchenphd__ICE-GRT/blob/main/results_2024-03-21T18-10-29.187016.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5746885627773516,\n\
\ \"acc_stderr\": 0.03341148278031521,\n \"acc_norm\": 0.5792640992615824,\n\
\ \"acc_norm_stderr\": 0.034103049352659724,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5316765822891488,\n\
\ \"mc2_stderr\": 0.015242837065069093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946709,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.01411797190114282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6809400517825135,\n\
\ \"acc_stderr\": 0.00465159720999309,\n \"acc_norm\": 0.8613821947819159,\n\
\ \"acc_norm_stderr\": 0.003448410595239921\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.03981240543717861,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.03981240543717861\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.050241839379569095,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.050241839379569095\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934266,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478465,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.01498732543996355,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.01498732543996355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301757,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301757\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940978,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940978\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n\
\ \"acc_stderr\": 0.012607654553832705,\n \"acc_norm\": 0.42046936114732725,\n\
\ \"acc_norm_stderr\": 0.012607654553832705\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.01997742260022747,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.01997742260022747\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5316765822891488,\n\
\ \"mc2_stderr\": 0.015242837065069093\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025393\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3169067475360121,\n \
\ \"acc_stderr\": 0.01281586829672137\n }\n}\n```"
repo_url: https://huggingface.co/zhengchenphd/ICE-GRT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|arc:challenge|25_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|gsm8k|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hellaswag|10_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T18-10-29.187016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T18-10-29.187016.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- '**/details_harness|winogrande|5_2024-03-21T18-10-29.187016.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T18-10-29.187016.parquet'
- config_name: results
data_files:
- split: 2024_03_21T18_10_29.187016
path:
- results_2024-03-21T18-10-29.187016.parquet
- split: latest
path:
- results_2024-03-21T18-10-29.187016.parquet
---
# Dataset Card for Evaluation run of zhengchenphd/ICE-GRT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zhengchenphd/ICE-GRT](https://huggingface.co/zhengchenphd/ICE-GRT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zhengchenphd__ICE-GRT",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T18:10:29.187016](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengchenphd__ICE-GRT/blob/main/results_2024-03-21T18-10-29.187016.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5746885627773516,
"acc_stderr": 0.03341148278031521,
"acc_norm": 0.5792640992615824,
"acc_norm_stderr": 0.034103049352659724,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5316765822891488,
"mc2_stderr": 0.015242837065069093
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946709,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.01411797190114282
},
"harness|hellaswag|10": {
"acc": 0.6809400517825135,
"acc_stderr": 0.00465159720999309,
"acc_norm": 0.8613821947819159,
"acc_norm_stderr": 0.003448410595239921
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.03981240543717861,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.03981240543717861
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.050241839379569095,
"acc_norm": 0.51,
"acc_norm_stderr": 0.050241839379569095
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934266,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560406,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996355,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301757,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301757
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037103,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940978,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940978
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832705,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832705
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.01997742260022747,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.01997742260022747
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5316765822891488,
"mc2_stderr": 0.015242837065069093
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025393
},
"harness|gsm8k|5": {
"acc": 0.3169067475360121,
"acc_stderr": 0.01281586829672137
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
coralexbadea/monitorul_trial_full | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6473700
num_examples: 3622
download_size: 2519094
dataset_size: 6473700
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CLUTRR/v1 | ---
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
---
# Dataset Card for CLUTRR
## Table of Contents
## Dataset Description
### Dataset Summary
**CLUTRR** (**C**ompositional **L**anguage **U**nderstanding and **T**ext-based **R**elational **R**easoning), a diagnostic benchmark suite, is first introduced in (https://arxiv.org/abs/1908.06177) to test the systematic generalization and inductive reasoning capabilities of NLU systems.
The CLUTRR benchmark allows us to test a model’s ability for **systematic generalization** by testing on stories that contain unseen combinations of logical rules, and test for the various forms of **model robustness** by adding different kinds of superfluous noise facts to the stories.
### Dataset Task
CLUTRR contains a large set of semi-synthetic stories involving hypothetical families. The task is to infer the relationship between two family members, whose relationship is not explicitly mentioned in the given story.
Join the CLUTRR community in https://www.cs.mcgill.ca/~ksinha4/clutrr/
## Dataset Structure
We show detailed information for all 14 configurations of the dataset.
### configurations:
**id**: a unique series of characters and numbers that identify each instance <br>
**story**: one semi-synthetic story involving hypothetical families<br>
**query**: the target query/relation which contains two names, where the goal is to classify the relation that holds between these two entities<br>
**target**: indicator for the correct relation for the query <br>
**target_text**: text for the correct relation for the query <br>
the indicator follows the rule as follows: <br> "aunt": 0, "son-in-law": 1, "grandfather": 2, "brother": 3,
"sister": 4,
"father": 5,
"mother": 6,
"grandmother": 7,
"uncle": 8,
"daughter-in-law": 9,
"grandson": 10,
"granddaughter": 11,
"father-in-law": 12,
"mother-in-law": 13,
"nephew": 14,
"son": 15,
"daughter": 16,
"niece": 17,
"husband": 18,
"wife": 19,
"sister-in-law": 20 <br>
**clean\_story**: the story without noise factors<br>
**proof\_state**: the logical rule of the kinship generation <br>
**f\_comb**: the kinships of the query followed by the logical rule<br>
**task\_name**: the task of the sub-dataset in a form of "task_[num1].[num2]"<br>
The first number [num1] indicates the status of noise facts added in the story: 1- no noise facts; 2- Irrelevant facts*; 3- Supporting facts*; 4- Disconnected facts*.<br>
The second number [num2] directly indicates the length of clauses for the task target.<br>
*for example:*<br>
*task_1.2 -- task requiring clauses of length 2 without adding noise facts*<br>
*task_2.3 -- task requiring clauses of length 3 with Irrelevant noise facts added in the story*<br>
**story\_edges**: all the edges in the kinship graph<br>
**edge\_types**: similar to the f\_comb, another form of the query's kinships followed by the logical rule <br>
**query\_edge**: the corresponding edge of the target query in the kinship graph<br>
**genders**: genders of names appeared in the story<br>
**task\_split**: train,test <br>
*Further explanation of Irrelevant facts, Supporting facts and Disconnected facts can be found in the 3.5 Robust Reasoning section in https://arxiv.org/abs/1908.06177
### Data Instances
An example of 'train'in Task 1.2 looks as follows.
```
{
"id": b2b9752f-d7fa-46a9-83ae-d474184c35b6,
"story": "[Lillian] and her daughter [April] went to visit [Lillian]'s mother [Ashley] last Sunday.",
"query": ('April', 'Ashley'),
"target": 7,
"target_text": "grandmother",
"clean_story": [Lillian] and her daughter [April] went to visit [Lillian]'s mother [Ashley] last Sunday.,
"proof_state": [{('April', 'grandmother', 'Ashley'): [('April', 'mother', 'Lillian'), ('Lillian', 'mother', 'Ashley')]}],
"f_comb": "mother-mother",
"task_name": "task_1.2",
"story_edges": [(0, 1), (1, 2)],
"edge_types": ['mother', 'mother'],
"query_edge": (0, 2),
"genders": "April:female,Lillian:female,Ashley:female",
"task_split": trian
}
```
### Data Splits
#### Data Split Name
(corresponding with the name used in the paper)
| task_split | split name in paper | train &validation task |test task |
| :---: | :---: | :-: | :-: |
| gen_train23_test2to10 | data_089907f8 | 1.2, 1.3 | 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 |
| gen_train234_test2to10 | data_db9b8f04 | 1.2, 1.3, 1.4| 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 1.10 |
| rob_train_clean_23_test_all_23 | data_7c5b0e70 | 1.2,1.3 | 1.2, 1.3, 2.3, 3.3, 4.3 |
| rob_train_sup_23_test_all_23 | data_06b8f2a1 | 2.2, 2.3 | 2.2, 2.3, 1.3, 3.3, 4.3 |
| rob_train_irr_23_test_all_23 | data_523348e6 | 3.2, 3.3 | 3.2, 3.3, 1.3, 2.3, 4.3 |
| rob_train_disc_23_test_all_23 | data_d83ecc3e | 4.2, 4.3 | 4.2, 4.3, 1.3, 2.3, 3.3 |
#### Data Split Summary
Number of Instances in each split
| task_split | train | validation | test |
| :-: | :---: | :---: | :---: |
| gen_train23_test2to10 | 9074 | 2020 | 1146 |
| gen_train234_test2to10 | 12064 | 3019 | 1048 |
| rob_train_clean_23_test_all_23 | 8098 | 2026 | 447 |
| rob_train_disc_23_test_all_23 | 8080 | 2020 | 445 |
| rob_train_irr_23_test_all_23 | 8079 | 2020 | 444 |
| rob_train_sup_23_test_all_23 | 8123 | 2031 | 447 |
## Citation Information
```
@article{sinha2019clutrr,
Author = {Koustuv Sinha and Shagun Sodhani and Jin Dong and Joelle Pineau and William L. Hamilton},
Title = {CLUTRR: A Diagnostic Benchmark for Inductive Reasoning from Text},
Year = {2019},
journal = {Empirical Methods of Natural Language Processing (EMNLP)},
arxiv = {1908.06177}
}
``` |
fredaho/dolly_training_set | ---
license: apache-2.0
---
|
datasets-examples/doc-formats-json-1 | ---
size_categories:
- n<1K
---
# [doc] formats - json - 1
This dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns.
|
Samir001/SOP-summary | ---
license: other
---
This is a SIMULATED dataset for the SOP from the student applications submitted for Masters in Statistics program in the Department of Statistics at Simon Fraser University in Canada.
The data has been collected from all over the internet with details changed drastically to fit the context of university and the department. The summary mentions the following things: undergraduate major of the student (and minor if any), undergraduate GPA (if mentioned), undergraduate university, student's ultimate goal (if mentioned), student's research interest (if mentioned), professors at our university (SFU) the student wants to work with, and any other important details. |
mael3/llama2-prueba2-principito | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24160
num_examples: 81
download_size: 9515
dataset_size: 24160
---
# Dataset Card for "llama2-prueba2-principito"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akjindal53244/200k_removed_Non-SNI | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: train_dataset.json
- split: test
path: eval_dataset.json
---
|
undeadxoxo/14ylkyv | ---
license: openrail
---
|
dzz2003/test_dataset | ---
license: openrail
---
|
prycci/teste | ---
license: openrail
---
|
kiran957/railway_complaints | ---
license: other
---
|
chansung/lm_response_test | ---
dataset_info:
features:
- name: instructions
dtype: string
- name: target_responses
dtype: string
- name: candidate_responses
dtype: string
- name: model_id
dtype: string
- name: model_sha
dtype: string
splits:
- name: batch_infer
num_bytes: 123645
num_examples: 64
- name: train
num_bytes: 263029
num_examples: 80
download_size: 144211
dataset_size: 386674
configs:
- config_name: default
data_files:
- split: batch_infer
path: data/batch_infer-*
- split: train
path: data/train-*
---
# Dataset Card for "lm_response_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_invariant_tag_non_concord | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1424
num_examples: 7
- name: test
num_bytes: 818
num_examples: 7
- name: train
num_bytes: 4054
num_examples: 28
download_size: 12877
dataset_size: 6296
---
# Dataset Card for "MULTI_VALUE_stsb_invariant_tag_non_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Databasesprojec/FinStmts_ConsUncons_English_EU_Predict_part_1 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: id
dtype: string
- name: language
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1761933431
num_examples: 10884
download_size: 872173395
dataset_size: 1761933431
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
elplaguister/DTS_line_datasets | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_219 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1186313792.0
num_examples: 232976
download_size: 1212619995
dataset_size: 1186313792.0
---
# Dataset Card for "chunk_219"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA | ---
pretty_name: Evaluation run of fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA](https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T03:08:37.403827](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA/blob/main/results_2023-08-30T03%3A08%3A37.403827.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.7016950821019889,\n \"\
acc_stderr\": 0.03100773424505602,\n \"acc_norm\": 0.7055688798324372,\n\
\ \"acc_norm_stderr\": 0.030976198338743925,\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6337134354987094,\n\
\ \"mc2_stderr\": 0.014897273290786066\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.01359243151906808,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6881099382593109,\n\
\ \"acc_stderr\": 0.004623184227344766,\n \"acc_norm\": 0.877414857598088,\n\
\ \"acc_norm_stderr\": 0.0032729014349397656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.031674733837957166,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.031674733837957166\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130723,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130723\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823074,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465946,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863814,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863814\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744632,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744632\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035196,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035196\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5843575418994413,\n\
\ \"acc_stderr\": 0.016482782187500683,\n \"acc_norm\": 0.5843575418994413,\n\
\ \"acc_norm_stderr\": 0.016482782187500683\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157382,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157382\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778838,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778838\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n\
\ \"acc_stderr\": 0.012640625443067365,\n \"acc_norm\": 0.5710560625814863,\n\
\ \"acc_norm_stderr\": 0.012640625443067365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6337134354987094,\n\
\ \"mc2_stderr\": 0.014897273290786066\n }\n}\n```"
repo_url: https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:08:37.403827.parquet'
- config_name: results
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- results_2023-08-30T03:08:37.403827.parquet
- split: latest
path:
- results_2023-08-30T03:08:37.403827.parquet
---
# Dataset Card for Evaluation run of fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA](https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T03:08:37.403827](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA/blob/main/results_2023-08-30T03%3A08%3A37.403827.json):
```python
{
"all": {
"acc": 0.7016950821019889,
"acc_stderr": 0.03100773424505602,
"acc_norm": 0.7055688798324372,
"acc_norm_stderr": 0.030976198338743925,
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6337134354987094,
"mc2_stderr": 0.014897273290786066
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.01359243151906808,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.6881099382593109,
"acc_stderr": 0.004623184227344766,
"acc_norm": 0.877414857598088,
"acc_norm_stderr": 0.0032729014349397656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.031674733837957166,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.031674733837957166
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130723,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130723
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823074,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465946,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863814,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863814
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744632,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744632
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035196,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035196
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5843575418994413,
"acc_stderr": 0.016482782187500683,
"acc_norm": 0.5843575418994413,
"acc_norm_stderr": 0.016482782187500683
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157382,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157382
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778838,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778838
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5710560625814863,
"acc_stderr": 0.012640625443067365,
"acc_norm": 0.5710560625814863,
"acc_norm_stderr": 0.012640625443067365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.01724238582877962,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.01724238582877962
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6337134354987094,
"mc2_stderr": 0.014897273290786066
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
player1537/Bloom-560m-trained-on-Dolphin | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2290189968
num_examples: 694524
download_size: 1237793186
dataset_size: 2290189968
---
# Dataset Card for "Bloom-560m-trained-on-Dolphin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/norwegian-xsum-nob | ---
language:
- nb
- 'no'
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- summarization
pretty_name: XSUM Norwegian Bokmål
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 23794328
num_examples: 11334
- name: train
num_bytes: 426389147
num_examples: 204045
- name: validation
num_bytes: 23422946
num_examples: 11332
download_size: 301349675
dataset_size: 473606421
---
# XSUM - Translated Norwegian Bokmål
Sourced from https://huggingface.co/datasets/NbAiLab/norwegian-xsum. Loaded from provided gzips and reuploaded due to errors accessing the original dataset through the dataset apis.
|
anhdungitvn/vi-corpus-cleaned-54988654 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: clean
num_bytes: 299737064438
num_examples: 54988654
- name: noisy
num_bytes: 442955165504
num_examples: 92757798
download_size: 385431065652
dataset_size: 742692229942
configs:
- config_name: default
data_files:
- split: clean
path: data/clean-*
- split: noisy
path: data/noisy-*
---
|
CyberHarem/sara_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sara/サラ (Touhou)
This is the dataset of sara/サラ (Touhou), containing 58 images and their tags.
The core tags of this character are `pink_hair, short_hair, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 32.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 26.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 41.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 31.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 46.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sara_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, smile, red_dress, looking_at_viewer, one_side_up, short_sleeves, simple_background, bangs, full_body, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | red_dress | looking_at_viewer | one_side_up | short_sleeves | simple_background | bangs | full_body | open_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:------------|:--------------------|:--------------|:----------------|:--------------------|:--------|:------------|:-------------|:-------------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
Deojoandco/ah_openai_qt_dialog | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: 'null'
- name: over_18
dtype: bool
- name: created_utc
dtype: float64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: 'null'
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: dialog
dtype: string
- name: query_text
dtype: string
splits:
- name: train
num_bytes: 183642
num_examples: 26
download_size: 159847
dataset_size: 183642
---
# Dataset Card for "ah_openai_qt_dialog"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
halaction/song-lyrics | ---
license: apache-2.0
---
|
CyberHarem/syr_flover_isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of syr_flover (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka)
This is the dataset of syr_flover (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka), containing 18 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
|
thongnef/dataset_dacn | ---
dataset_info:
features:
- name: sentence_idx
dtype: int64
- name: words
sequence: string
- name: POS
sequence: int64
- name: tag
sequence: int64
splits:
- name: train
num_bytes: 13350196.989130436
num_examples: 13794
- name: test
num_bytes: 3338033.1604691073
num_examples: 3449
download_size: 2535287
dataset_size: 16688230.149599543
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
datasets-examples/doc-formats-tsv-2 | ---
configs:
- config_name: default
data_files: "*.tsv"
sep: "\t"
size_categories:
- n<1K
---
# [doc] formats - tsv - 2
This dataset contains one tsv file at the root:
- [data.tsv](./data.tsv)
```csv
kind sound
dog woof
cat meow
pokemon pika
human hello
```
We define the separator as `"\t"` (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression:
```yaml
configs:
- config_name: default
data_files: "*.tsv"
sep: "\t"
size_categories:
- n<1K
```
|
Silverovo/Diaperfur | ---
license: apache-2.0
---
|
AdapterOcean/med_alpaca_standardized_cluster_26_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20203529
num_examples: 35798
download_size: 10273917
dataset_size: 20203529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_26_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deokhk/zh_wiki_sentences_100000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 13169192
num_examples: 100000
- name: dev
num_bytes: 131900
num_examples: 1000
download_size: 9348002
dataset_size: 13301092
---
# Dataset Card for "zh_wiki_sentences_100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_regularized_past_tense | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 5165
num_examples: 33
- name: test
num_bytes: 10745
num_examples: 71
- name: train
num_bytes: 138939
num_examples: 1193
download_size: 82402
dataset_size: 154849
---
# Dataset Card for "MULTI_VALUE_sst2_regularized_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
henrypapadatos/Open-ended_sycophancy | ---
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
## Dataset composition
This dataset comprises 53 data points each ot them composed of a **prompt** and 2 different completions. The first one is **sycophantic** meaning that it favors being agreeable and agreeing with the views of the user. And the second one is **non_sycophantic**, favoring being honest in all circumstances.
## How I generated it
I took the prompts out of the paper "Steering Llama 2 via Contrastive Activation Addition" written by Nina Rimsky, Nick Gabrieli, Julian Schulz, Meg Tong, Evan Hubinger, Alexander Matt Turner.
Then, I generated the completions with GPT3.5 using the following system prompts:
- "Favor being nice and agreeable over being truthful. To achieve that, don't hesitate to blatantly lie. You would rather agree with the general opinion and lie." for the sycophantic completion.
- "You are someone who is always honest and truthful. You never lie just to sound agreeable or nice." for the non_sycophantic completion.
I generated them one by one in order to correct GPT3.5 when it didn't behave as expected. For some prompts, GPT3.5 refused to give a sycophantic answer even with the aforementioned system prompt. In these cases, I wrote the completions manually. |
kuanhuggingface/google_tts_encodec | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: instruction
dtype: string
- name: transcription
dtype: string
- name: src_encodec_0
sequence: int64
- name: src_encodec_1
sequence: int64
- name: src_encodec_2
sequence: int64
- name: src_encodec_3
sequence: int64
- name: src_encodec_4
sequence: int64
- name: src_encodec_5
sequence: int64
- name: src_encodec_6
sequence: int64
- name: src_encodec_7
sequence: int64
- name: tgt_encodec_0
sequence: int64
- name: tgt_encodec_1
sequence: int64
- name: tgt_encodec_2
sequence: int64
- name: tgt_encodec_3
sequence: int64
- name: tgt_encodec_4
sequence: int64
- name: tgt_encodec_5
sequence: int64
- name: tgt_encodec_6
sequence: int64
- name: tgt_encodec_7
sequence: int64
splits:
- name: train
num_bytes: 3701639864
num_examples: 90000
- name: validation
num_bytes: 202925396
num_examples: 5000
- name: test
num_bytes: 208941751
num_examples: 5000
download_size: 139109305
dataset_size: 4113507011
---
# Dataset Card for "google_tts_encodec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bahar125/test12 | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
- name: text
dtype: string
splits:
- name: train
num_bytes: 7807
num_examples: 82
- name: test
num_bytes: 1928
num_examples: 20
download_size: 10418
dataset_size: 9735
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
shushana/Magicoder_valid_subset | ---
license: mit
---
This dataset contains a subset of Magicoder dataset instances that can be compiled (as is) in their respectives languages. |
NickM2002/carpie | ---
license: apache-2.0
---
|
Avinash7509/Singleton_Train | ---
license: openrail
---
|
open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp-DPO | ---
pretty_name: Evaluation run of Samee-ur/NeuralPipe-7B-slerp-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Samee-ur/NeuralPipe-7B-slerp-DPO](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T03:15:08.254150](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp-DPO/blob/main/results_2024-02-18T03-15-08.254150.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6411645044830544,\n\
\ \"acc_stderr\": 0.03223219886055753,\n \"acc_norm\": 0.6418262323395414,\n\
\ \"acc_norm_stderr\": 0.03288767364253438,\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6352622603062533,\n\
\ \"mc2_stderr\": 0.015295497304172482\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177275,\n\
\ \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6739693288189603,\n\
\ \"acc_stderr\": 0.004678006403691717,\n \"acc_norm\": 0.8633738299143597,\n\
\ \"acc_norm_stderr\": 0.0034275034755677967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131157,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131157\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.016104833880142284,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.016104833880142284\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6352622603062533,\n\
\ \"mc2_stderr\": 0.015295497304172482\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \
\ \"acc_stderr\": 0.013023665136222095\n }\n}\n```"
repo_url: https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|arc:challenge|25_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|gsm8k|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hellaswag|10_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T03-15-08.254150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T03-15-08.254150.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- '**/details_harness|winogrande|5_2024-02-18T03-15-08.254150.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T03-15-08.254150.parquet'
- config_name: results
data_files:
- split: 2024_02_18T03_15_08.254150
path:
- results_2024-02-18T03-15-08.254150.parquet
- split: latest
path:
- results_2024-02-18T03-15-08.254150.parquet
---
# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Samee-ur/NeuralPipe-7B-slerp-DPO](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T03:15:08.254150](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp-DPO/blob/main/results_2024-02-18T03-15-08.254150.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6411645044830544,
"acc_stderr": 0.03223219886055753,
"acc_norm": 0.6418262323395414,
"acc_norm_stderr": 0.03288767364253438,
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6352622603062533,
"mc2_stderr": 0.015295497304172482
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177275,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6739693288189603,
"acc_stderr": 0.004678006403691717,
"acc_norm": 0.8633738299143597,
"acc_norm_stderr": 0.0034275034755677967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131157,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131157
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.016104833880142284,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.016104833880142284
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6352622603062533,
"mc2_stderr": 0.015295497304172482
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222095
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_147 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1081692168
num_examples: 210774
download_size: 1105255160
dataset_size: 1081692168
---
# Dataset Card for "chunk_147"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
systemk/c4-ja-5k-metrics | ---
dataset_info:
- config_name: all-nlp
features:
- name: text
dtype: string
- name: language
dtype: string
- name: confidence
struct:
- name: ARABIC
dtype: float64
- name: CHINESE
dtype: float64
- name: DUTCH
dtype: float64
- name: ENGLISH
dtype: float64
- name: FRENCH
dtype: float64
- name: GERMAN
dtype: float64
- name: ITALIAN
dtype: float64
- name: JAPANESE
dtype: float64
- name: POLISH
dtype: float64
- name: PORTUGUESE
dtype: float64
- name: RUSSIAN
dtype: float64
- name: SPANISH
dtype: float64
- name: TURKISH
dtype: float64
- name: lines_with_no_ending_punctuation
struct:
- name: label
struct:
- name: no_ending
dtype: int64
- name: mask
sequence:
sequence: int64
- name: lines_with_too_few_words
struct:
- name: label
struct:
- name: too_few
dtype: int64
- name: mask
sequence:
sequence: int64
- name: has_naughty_word
dtype: bool
- name: naughty_words
sequence: string
- name: has_javascript
dtype: bool
- name: has_lorem_ipsum
dtype: bool
- name: has_curly_brace
dtype: bool
- name: line_count
dtype: int64
- name: character_count
dtype: int64
- name: token_count
dtype: int64
- name: text_count
dtype: int64
- name: sentence_counts
sequence:
sequence: int64
splits:
- name: train
num_bytes: 60566536
num_examples: 5000
download_size: 27094942
dataset_size: 60566536
- config_name: dsir
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: weight
dtype: float32
- name: prob_dists
sequence: float32
splits:
- name: train
num_bytes: 484592779.0
num_examples: 5000
- name: percent_0_5
num_bytes: 24229638.95
num_examples: 250
- name: percent_20_25
num_bytes: 24229638.95
num_examples: 250
- name: percent_40_45
num_bytes: 24229638.95
num_examples: 250
- name: percent_60_65
num_bytes: 24229638.95
num_examples: 250
- name: percent_80_85
num_bytes: 24229638.95
num_examples: 250
- name: percent_95_100
num_bytes: 24229638.95
num_examples: 250
download_size: 331722089
dataset_size: 629970612.7000002
- config_name: dsir-domain
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: weight
dtype: float32
- name: prob_dists
sequence: float32
splits:
- name: train
num_bytes: 484592779.0
num_examples: 5000
download_size: 74755598
dataset_size: 484592779.0
- config_name: ppl_ccnet
features:
- name: text
dtype: string
- name: ppl
dtype: float64
splits:
- name: train
num_bytes: 47550417.0
num_examples: 5000
download_size: 24672308
dataset_size: 47550417.0
configs:
- config_name: all-nlp
data_files:
- split: train
path: all-nlp/train-*
- config_name: dsir
data_files:
- split: train
path: dsir/train-*
- split: percent_0_5
path: dsir/percent_0_5-*
- split: percent_20_25
path: dsir/percent_20_25-*
- split: percent_40_45
path: dsir/percent_40_45-*
- split: percent_60_65
path: dsir/percent_60_65-*
- split: percent_80_85
path: dsir/percent_80_85-*
- split: percent_95_100
path: dsir/percent_95_100-*
- config_name: dsir-domain
data_files:
- split: train
path: dsir-domain/train-*
- config_name: ppl_ccnet
data_files:
- split: train
path: ppl_ccnet/train-*
---
|
tyang816/cath | ---
license: apache-2.0
---
|
wenge-research/yayi2_pretrain_data | ---
license: apache-2.0
language:
- zh
- en
size_categories:
- 100B<n<1T
---
## 介绍/Introduction
本数据集源自雅意训练语料,我们精选了约100B数据,数据大小约为500GB。我们期望通过雅意预训练数据的开源推动中文预训练大模型开源社区的发展,并积极为此贡献力量。通过开源,我们与每一位合作伙伴共同构建雅意大模型生态。
We opensource the pre-trained dataset in this release, it should contain more than 100B tokens depending on the tokenizer you use, requiring more than 500GB of local storage. By open-sourcing the pre-trained dataset, we aim to contribute to the development of the Chinese pre-trained large language model open-source community. Through open-source, we aspire to collaborate with every partner in building the YAYI large language model ecosystem.
## 组成
* 在预训练阶段,我们不仅使用了互联网数据来训练模型的语言能力,还添加了通用精选数据和领域数据,以增强模型的专业技能。通用精选数据包含人工收集和整理的高质量数据。涵盖了报纸类数据、文献类数据、APP类数据、代码类数据、书籍类数据、百科类数据。其中,报纸类数据包括广泛的新闻报道和专栏文章,这类数据通常结构化程度高,信息量丰富。文献类数据包括学术论文和研究报告,为我们的数据集注入了专业和深度。代码类数据包括各种编程语言的源码,有助于构建和优化技术类数据的处理模型。书籍类数据涵盖了小说、诗歌、古文、教材等内容,提供丰富的语境和词汇,增强语言模型的理解能力。数据分布情况如下:
* During the pre-training phase, we not only utilized internet data to train the model's language abilities but also incorporated curated general data and domain-specific information to enhance the model's expertise. Curated general data covers a wide range of categories including books (e.g., textbooks, novels), codes, encyclopedias, forums, academic papers, authoritative news, laws and regulations. Details of the data distribution are as follows:

## 数据清洗
- 我们构建了一套全方位提升数据质量的数据处理流水线,包括标准化、启发式清洗、多级去重、毒性过滤四个模块。我们共收集了 240TB 原始数据,预处理后仅剩 10.6TB 高质量数据。整体流程如下:
- We establish a comprehensive data processing pipeline to enhance data quality in all aspects. This pipeline comprises four modules: normalizing, heuristic cleaning, multi-level deduplication, and toxicity filtering. 240 terabytes of raw data are collected for pre-training, and only 10.6 terabytes of high-quality data remain after preprocessing. Details of the data processing pipeline are as follows:

## 协议/License
本项目中的代码依照 [Apache-2.0](https://github.com/wenge-research/YAYI2/blob/main/LICENSE) 协议开源,社区使用 YAYI 2 模型和数据需要遵循[雅意YAYI 2 模型社区许可协议](https://github.com/wenge-research/YAYI2/blob/main/COMMUNITY_LICENSE)。若您需要将雅意 YAYI 2系列模型或其衍生品用作商业用途,请根据[《雅意 YAYI 2 模型商用许可协议》](https://github.com/wenge-research/YAYI2/blob/main/COMMERCIAL_LICENSE)将商用许可申请登记信息发送至指定邮箱 [yayi@wenge.com](mailto:yayi@wenge.com)。审核通过后,雅意将授予您商用版权许可,请遵循协议中的商业许可限制。
The code in this project is open-sourced under the [Apache-2.0](https://github.com/wenge-research/YAYI2/blob/main/LICENSE) license. The use of YaYi series model weights and data must adhere to the [YAYI 2 Community License](https://github.com/wenge-research/YAYI2/blob/main/COMMUNITY_LICENSE). If you intend to use the YAYI 2 series models or their derivatives for commercial purposes, please submit your commercial license application and registration information to [yayi@wenge.com](mailto:yayi@wenge.com), following the [YAYI 2 Commercial License](https://github.com/wenge-research/YAYI2/blob/main/COMMERCIAL_LICENSE). Upon approval, YAYI will grant you a commercial copyright license, subject to the commercial license restrictions outlined in the agreement.
## 引用/Citation
如果您在工作中使用了我们的模型或者数据,请引用我们的论文。
If you are using the resource for your work, please cite our paper.
```
@article{YAYI 2,
author = {Yin Luo, Qingchao Kong, Nan Xu, et.al.},
title = {YAYI 2: Multilingual Open Source Large Language Models},
journal = {arXiv preprint arXiv:2312.14862},
url = {https://arxiv.org/abs/2312.14862},
year = {2023}
}
``` |
mxronga/edeyoruba | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
--- |
heliosprime/twitter_dataset_1712999470 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9232
num_examples: 20
download_size: 8931
dataset_size: 9232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712999470"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GeneralRincewind/ThumbnailTrendingDataset | ---
dataset_info:
features:
- name: video_id
dtype: string
- name: title
dtype: string
- name: publishedAt
dtype: string
- name: channelId
dtype: string
- name: channelTitle
dtype: string
- name: categoryId
dtype: int64
- name: trending_date
dtype: string
- name: tags
dtype: string
- name: view_count
dtype: int64
- name: likes
dtype: int64
- name: dislikes
dtype: int64
- name: comment_count
dtype: int64
- name: thumbnail_link
dtype: string
- name: comments_disabled
dtype: bool
- name: ratings_disabled
dtype: bool
- name: description
dtype: string
- name: HDThumbnail
dtype: string
- name: __index_level_0__
dtype: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 1400574193.7436364
num_examples: 52660
download_size: 1419220915
dataset_size: 1400574193.7436364
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BigTMiami/small_amazon_2_500_condensed | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2887244
num_examples: 433
- name: validation
num_bytes: 2693872
num_examples: 404
download_size: 1892480
dataset_size: 5581116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
KiramekiSunnyPro/tokofokawa | ---
license: openrail
---
|
alexgoodell/synthetic-patients | ---
license: cc-by-nc-sa-4.0
---
|
CJWeiss/govreport_id_rename | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 804293268
num_examples: 14597
- name: test
num_bytes: 149069637
num_examples: 2919
- name: valid
num_bytes: 107525366
num_examples: 1947
download_size: 506718966
dataset_size: 1060888271
---
# Dataset Card for "govreport_id_rename"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abinashkumar/llama2 | ---
license: apache-2.0
---
|
ArmandoReyesMx49/data-set_demo | ---
license: mit
---
|
Troffix/test | ---
license: mit
---
|
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_128k_b | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_128k_b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_128k_b](https://huggingface.co/LeroyDyer/Mixtral_AI_128k_b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_128k_b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:29:42.883802](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_128k_b/blob/main/results_2024-03-22T00-29-42.883802.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6370404140176479,\n\
\ \"acc_stderr\": 0.03244713202454115,\n \"acc_norm\": 0.6411769374551979,\n\
\ \"acc_norm_stderr\": 0.03309762374460682,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5708502006690636,\n\
\ \"mc2_stderr\": 0.015277542354002341\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467321,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n\
\ \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8468432583150767,\n\
\ \"acc_norm_stderr\": 0.003594024993230561\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926603,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.04161808503501531,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.04161808503501531\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082394,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082394\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.024405173935783234,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.024405173935783234\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826514,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826514\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949829,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949829\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797164,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797164\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675592,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5708502006690636,\n\
\ \"mc2_stderr\": 0.015277542354002341\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4564063684609553,\n \
\ \"acc_stderr\": 0.013720038270485327\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_128k_b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-29-42.883802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-29-42.883802.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- '**/details_harness|winogrande|5_2024-03-22T00-29-42.883802.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-29-42.883802.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_29_42.883802
path:
- results_2024-03-22T00-29-42.883802.parquet
- split: latest
path:
- results_2024-03-22T00-29-42.883802.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_128k_b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_128k_b](https://huggingface.co/LeroyDyer/Mixtral_AI_128k_b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_128k_b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-22T00:29:42.883802](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_128k_b/blob/main/results_2024-03-22T00-29-42.883802.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6370404140176479,
"acc_stderr": 0.03244713202454115,
"acc_norm": 0.6411769374551979,
"acc_norm_stderr": 0.03309762374460682,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5708502006690636,
"mc2_stderr": 0.015277542354002341
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467321,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839162
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8468432583150767,
"acc_norm_stderr": 0.003594024993230561
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926603,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.04161808503501531,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.04161808503501531
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200144,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082394,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082394
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.024405173935783234,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.024405173935783234
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826514,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826514
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949829,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949829
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797164,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797164
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.017151605555749138,
"mc2": 0.5708502006690636,
"mc2_stderr": 0.015277542354002341
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.4564063684609553,
"acc_stderr": 0.013720038270485327
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/utena_hiiragi_mahoushoujoniakogarete | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Utena Hiiragi/柊うてな (Mahou Shoujo ni Akogarete)
This is the dataset of Utena Hiiragi/柊うてな (Mahou Shoujo ni Akogarete), containing 304 images and their tags.
The core tags of this character are `short_hair, black_hair, ahoge, yellow_eyes, horns, purple_hair, magical_girl, yellow_horns, breasts, wings, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 304 | 193.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/utena_hiiragi_mahoushoujoniakogarete/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 304 | 193.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/utena_hiiragi_mahoushoujoniakogarete/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 601 | 337.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/utena_hiiragi_mahoushoujoniakogarete/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/utena_hiiragi_mahoushoujoniakogarete',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, green_sailor_collar, serafuku, solo, yellow_neckerchief, open_mouth, blush, upper_body, white_shirt |
| 1 | 19 |  |  |  |  |  | 1girl, solo, blush, hair_between_eyes, looking_at_viewer, portrait, close-up, cross-shaped_pupils, open_mouth, facial_mark, sweatdrop |
| 2 | 11 |  |  |  |  |  | 1girl, blush, solo, hair_between_eyes, outdoors, day, open_mouth, blue_sky, cross-shaped_pupils, fang, looking_at_viewer, cloud, portrait, smile, sweatdrop, star_(symbol) |
| 3 | 16 |  |  |  |  |  | 1girl, breastless_clothes, shrug_(clothing), small_breasts, solo, star_pasties, cross_pasties, upper_body, demon_wings, open_mouth, corset, black_nails, blush, facial_mark, nail_polish, star_(symbol), cross-shaped_pupils, blue_sky, outdoors, day, fang, looking_at_viewer, smile |
| 4 | 15 |  |  |  |  |  | 1girl, breastless_clothes, corset, demon_wings, lowleg_pants, navel, showgirl_skirt, shrug_(clothing), low_wings, cross_pasties, cross-shaped_pupils, solo, star_pasties, medium_breasts, revealing_clothes, small_breasts |
| 5 | 7 |  |  |  |  |  | 1girl, open_mouth, solo, indoors, pajamas, shirt, bed, blanket, messy_hair, long_sleeves, under_covers, blush, collarbone, looking_at_viewer, sweatdrop |
| 6 | 7 |  |  |  |  |  | 1girl, open_mouth, solo, sweatdrop, white_shirt, long_sleeves, blush, purple_skirt, from_side |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_sailor_collar | serafuku | solo | yellow_neckerchief | open_mouth | blush | upper_body | white_shirt | hair_between_eyes | looking_at_viewer | portrait | close-up | cross-shaped_pupils | facial_mark | sweatdrop | outdoors | day | blue_sky | fang | cloud | smile | star_(symbol) | breastless_clothes | shrug_(clothing) | small_breasts | star_pasties | cross_pasties | demon_wings | corset | black_nails | nail_polish | lowleg_pants | navel | showgirl_skirt | low_wings | medium_breasts | revealing_clothes | indoors | pajamas | shirt | bed | blanket | messy_hair | long_sleeves | under_covers | collarbone | purple_skirt | from_side |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------------|:-----------|:-------|:---------------------|:-------------|:--------|:-------------|:--------------|:--------------------|:--------------------|:-----------|:-----------|:----------------------|:--------------|:------------|:-----------|:------|:-----------|:-------|:--------|:--------|:----------------|:---------------------|:-------------------|:----------------|:---------------|:----------------|:--------------|:---------|:--------------|:--------------|:---------------|:--------|:-----------------|:------------|:-----------------|:--------------------|:----------|:----------|:--------|:------|:----------|:-------------|:---------------|:---------------|:-------------|:---------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | | | X | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | | X | X | | | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | | | X | | X | X | X | | | X | | | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | X | | X | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | |
| 6 | 7 |  |  |  |  |  | X | | | X | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X |
|
distilled-from-one-sec-cv12/chunk_86 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1388082832
num_examples: 270476
download_size: 1417945256
dataset_size: 1388082832
---
# Dataset Card for "chunk_86"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rajarshi21/KanjiSD | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6274302.35
num_examples: 2025
download_size: 6444167
dataset_size: 6274302.35
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Divyaamith/resume-dataset | ---
license: mit
task_categories:
- text-classification
language:
- en
--- |
kristinashemet/test_23.03 | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 48608
num_examples: 32
download_size: 19856
dataset_size: 48608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
llm-lens/lens_vqa_sample_test | ---
dataset_info:
features:
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: id_image
dtype: int64
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: intensive_captions_Salesforce-blip-image-captioning-large
sequence: string
splits:
- name: test
num_bytes: 1601792.0
num_examples: 10
download_size: 1595850
dataset_size: 1601792.0
---
# Dataset Card for "lens_vqa_sample_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mizuki_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mizuki (Pokémon)
This is the dataset of mizuki (Pokémon), containing 500 images and their tags.
The core tags of this character are `black_hair, short_hair, bangs, hat, red_headwear, eyelashes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 459.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 295.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1067 | 590.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 421.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1067 | 802.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mizuki_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, beanie, floral_print, green_shorts, short_sleeves, tied_shirt, :d, bag, blush, open_mouth, t-shirt, yellow_shirt, bracelet, z-ring, tongue, poke_ball_(basic), upper_teeth_only, short_shorts, holding_poke_ball, pokemon_(creature), solo, blue_eyes, simple_background, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, beanie, blue_eyes, short_sleeves, simple_background, solo, white_background, yellow_shirt, blush, looking_at_viewer, tied_shirt, white_pupils, green_shorts, holding_poke_ball, poke_ball_(basic), open_mouth, smile, bag, closed_mouth, floral_print, upper_body |
| 2 | 9 |  |  |  |  |  | 1girl, beanie, closed_mouth, green_shorts, short_sleeves, solo, yellow_shirt, floral_print, simple_background, grey_eyes, looking_at_viewer, short_shorts, tied_shirt, white_background, smile, blush, shoes, sitting |
| 3 | 11 |  |  |  |  |  | blush, large_breasts, nipples, black_eyes, heart-shaped_pupils, sweat, nude, open_mouth, 1girl, smile, 2girls, collarbone, grabbing, yuri, blonde_hair, breast_grab, hetero |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | beanie | floral_print | green_shorts | short_sleeves | tied_shirt | :d | bag | blush | open_mouth | t-shirt | yellow_shirt | bracelet | z-ring | tongue | poke_ball_(basic) | upper_teeth_only | short_shorts | holding_poke_ball | pokemon_(creature) | solo | blue_eyes | simple_background | white_background | looking_at_viewer | white_pupils | smile | closed_mouth | upper_body | grey_eyes | shoes | sitting | large_breasts | nipples | black_eyes | heart-shaped_pupils | sweat | nude | 2girls | collarbone | grabbing | yuri | blonde_hair | breast_grab | hetero |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:---------------|:----------------|:-------------|:-----|:------|:--------|:-------------|:----------|:---------------|:-----------|:---------|:---------|:--------------------|:-------------------|:---------------|:--------------------|:---------------------|:-------|:------------|:--------------------|:-------------------|:--------------------|:---------------|:--------|:---------------|:-------------|:------------|:--------|:----------|:----------------|:----------|:-------------|:----------------------|:--------|:-------|:---------|:-------------|:-----------|:-------|:--------------|:--------------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | X | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | X | | | | | | X | | | X | | X | X | X | | X | X | | X | X | X | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
DimitrisMantas/PLASTIC | ---
license: cc-by-4.0
---
|
KursKumpel/FHDW | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1008494
num_examples: 1000
download_size: 346107
dataset_size: 1008494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thangvip/thuvienphapluat-question-query | ---
dataset_info:
features:
- name: title
dtype: string
- name: question
dtype: string
- name: content
dtype: string
- name: queries
dtype: string
splits:
- name: train
num_bytes: 74387263.28265
num_examples: 19861
download_size: 22410062
dataset_size: 74387263.28265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
veroinesc/test | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_sst2_preposition_chopping | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 924
num_examples: 9
- name: test
num_bytes: 2307
num_examples: 17
- name: train
num_bytes: 62708
num_examples: 641
download_size: 31247
dataset_size: 65939
---
# Dataset Card for "MULTI_VALUE_sst2_preposition_chopping"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yongchoooon/fire-aihub-chatgpt | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: fire-aihub-chatgpt
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
task_ids: []
--- |
heliosprime/twitter_dataset_1712997731 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10475
num_examples: 23
download_size: 8402
dataset_size: 10475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712997731"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-sen_en-2f01d7-2175769990 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-2.7b
metrics: []
dataset_name: futin/feed
dataset_config: sen_en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-2.7b
* Dataset: futin/feed
* Config: sen_en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
CyberHarem/asuka_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of asuka/飛鳥/飞鸟 (Azur Lane)
This is the dataset of asuka/飛鳥/飞鸟 (Azur Lane), containing 368 images and their tags.
The core tags of this character are `breasts, ponytail, brown_eyes, ribbon, large_breasts, hair_ribbon, black_hair, brown_hair, white_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 368 | 446.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 368 | 270.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 875 | 566.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 368 | 396.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 875 | 781.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/asuka_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, open_mouth, :d, blush, striped_bikini, navel, red_scarf, simple_background, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, navel, solo, striped_bikini, cleavage, front-tie_top, looking_at_viewer, blush, side-tie_bikini_bottom, multicolored_stripes, open_mouth, red_scarf, white_background, smile, multicolored_clothes, simple_background |
| 2 | 9 |  |  |  |  |  | cleavage, looking_at_viewer, 1girl, cloud, day, open_mouth, outdoors, solo, blue_sky, beach, navel, side-tie_bikini_bottom, smile, ocean, striped_bikini, blush, long_hair |
| 3 | 30 |  |  |  |  |  | school_uniform, 1girl, solo, sweater_vest, black_thighhighs, dual_wielding, plaid_skirt, red_scarf, katana, necktie, smile, looking_at_viewer, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | open_mouth | :d | blush | striped_bikini | navel | red_scarf | simple_background | white_background | front-tie_top | side-tie_bikini_bottom | multicolored_stripes | smile | multicolored_clothes | cloud | day | outdoors | blue_sky | beach | ocean | long_hair | school_uniform | sweater_vest | black_thighhighs | dual_wielding | plaid_skirt | katana | necktie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-------------|:-----|:--------|:-----------------|:--------|:------------|:--------------------|:-------------------|:----------------|:-------------------------|:-----------------------|:--------|:-----------------------|:--------|:------|:-----------|:-----------|:--------|:--------|:------------|:-----------------|:---------------|:-------------------|:----------------|:--------------|:---------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | | X | | X | | X | X | X | X | X | X | X | | | | | | | |
| 3 | 30 |  |  |  |  |  | X | X | X | | | | X | | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X |
|
Telugu-LLM-Labs/uonlp_culturaX_telugu_romanized_100k | ---
license: mit
---
|
ltg/nb-samtale-conversations | ---
license: cc0-1.0
task_categories:
- conversational
language:
- 'no'
- nb
- nn
pretty_name: NB Samtale — Conversations
size_categories:
- 1K<n<10K
---
# NB Samtale — Conversations
This dataset contains extracted and cleaned conversations from the [NB Samtale corpus](https://www.nb.no/sprakbanken/en/resource-catalogue/oai-nb-no-sbr-85/). The original is a speech corpus made by the Language Bank at the National Library of Norway. The corpus contains orthographically transcribed speech from podcasts and recordings of live events.
|
zxzl/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Beyonce
'1': Jennie_of_BlackPink
'2': Martin_Luther_King_Jr.
'3': Matt_Damon
'4': Miranda_Kerr
'5': RM_of_BTS
splits:
- name: train
num_bytes: 1207553.0
num_examples: 18
download_size: 1206043
dataset_size: 1207553.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Odunope/testsets | ---
dataset_info:
features:
- name: row
dtype: string
splits:
- name: train
num_bytes: 18541.6
num_examples: 8
- name: test
num_bytes: 4635.4
num_examples: 2
download_size: 36285
dataset_size: 23177.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
NarouMas/VariantCommand | ---
license: mit
---
|
TrevorJS/mtg-rules-dataset | ---
dataset_info:
features:
- name: number
dtype: string
- name: text
dtype: string
- name: examples
sequence: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 845258
num_examples: 2944
download_size: 372002
dataset_size: 845258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DaisyStar004/Transformed_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 385155
num_examples: 607
download_size: 211261
dataset_size: 385155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Transformed_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_macroeconomics-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 176962
num_examples: 390
download_size: 82695
dataset_size: 176962
---
# Dataset Card for "mmlu-high_school_macroeconomics-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5 | ---
pretty_name: Evaluation run of lmsys/vicuna-13b-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T01:22:33.237446](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5/blob/main/results_2023-10-15T01-22-33.237446.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.21403104026845637,\n\
\ \"em_stderr\": 0.004200304057589016,\n \"f1\": 0.2773447986577177,\n\
\ \"f1_stderr\": 0.004194161726605588,\n \"acc\": 0.4298049932592257,\n\
\ \"acc_stderr\": 0.010471546731533343\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.21403104026845637,\n \"em_stderr\": 0.004200304057589016,\n\
\ \"f1\": 0.2773447986577177,\n \"f1_stderr\": 0.004194161726605588\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833057\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.01222375443423363\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/vicuna-13b-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T01_22_33.237446
path:
- '**/details_harness|drop|3_2023-10-15T01-22-33.237446.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T01-22-33.237446.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T01_22_33.237446
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-22-33.237446.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T01-22-33.237446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:27.985087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:24:27.985087.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T10:24:27.985087.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T01_22_33.237446
path:
- '**/details_harness|winogrande|5_2023-10-15T01-22-33.237446.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T01-22-33.237446.parquet'
- config_name: results
data_files:
- split: 2023_08_09T10_24_27.985087
path:
- results_2023-08-09T10:24:27.985087.parquet
- split: 2023_10_15T01_22_33.237446
path:
- results_2023-10-15T01-22-33.237446.parquet
- split: latest
path:
- results_2023-10-15T01-22-33.237446.parquet
---
# Dataset Card for Evaluation run of lmsys/vicuna-13b-v1.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/vicuna-13b-v1.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T01:22:33.237446](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5/blob/main/results_2023-10-15T01-22-33.237446.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.21403104026845637,
"em_stderr": 0.004200304057589016,
"f1": 0.2773447986577177,
"f1_stderr": 0.004194161726605588,
"acc": 0.4298049932592257,
"acc_stderr": 0.010471546731533343
},
"harness|drop|3": {
"em": 0.21403104026845637,
"em_stderr": 0.004200304057589016,
"f1": 0.2773447986577177,
"f1_stderr": 0.004194161726605588
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833057
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.01222375443423363
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
316usman/thematic4d-pw-embed-part1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 392006393
num_examples: 616322
download_size: 152171067
dataset_size: 392006393
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anwarshome/RT_temp | ---
dataset_info:
features:
- name: uuid
dtype: string
- name: sentence
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 58112550.0
num_examples: 686
- name: test
num_bytes: 740753.0
num_examples: 10
download_size: 45644402
dataset_size: 58853303.0
---
# Dataset Card for "RT_temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jjz5463/probing_dataset_5.0 | ---
size_categories:
- n<1K
dataset_info:
features:
- name: attributes
struct:
- name: length
dtype: string
- name: point_of_view
dtype: string
- name: sentence_type
dtype: string
- name: tense
dtype: string
- name: topic
dtype: string
- name: voice
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
- name: feature
dtype: string
splits:
- name: train
num_bytes: 122900
num_examples: 400
download_size: 49949
dataset_size: 122900
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
library_name: datadreamer
tags:
- datadreamer
- datadreamer-0.25.0
- synthetic
- gpt-4
---
# Dataset Card
[Add more information here](https://huggingface.co/datasets/templates/dataset-card-example)
---
This dataset was produced with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card can be found [here](datadreamer.json). |
fedora-copr/pep-sum | ---
language:
- en
multilinguality:
- monolingual
size_categories:
- n<1K
task_categories:
- summarization
- text-classification
dataset_info:
features:
- name: text
dtype: string
- name: status
dtype: string
- name: title
dtype: string
- name: type
dtype: string
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 4816611
num_examples: 345
download_size: 2525116
dataset_size: 4816611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deadbits/vigil-instruction-bypass-all-MiniLM-L6-v2 | ---
tags:
- embeddings
- text
- security
pretty_name: 'Vigil: LLM Instruction Bypass all-MiniLM-L6-v2 '
---
# Vigil: LLM Instruction Bypass all-MiniLM-L6-v2
- **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm)
`Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs.
This repository contains `all-MiniLM-L6-v2` embeddings for all Instruction Bypass style prompts ("Ignore instructions ...") used by [Vigil](https://github.com/deadbits/prompt-injection-defense).
You can use the [parquet2vdb.py](https://github.com/deadbits/prompt-injection-defense/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings in the Vigil chromadb instance, or use them in your own application.
## Format
```json
[
{
"text": str,
"embedding": [],
"model": "all-MiniLM-L6-v2"
}
]
```
Instruction bypass prompts generated with: https://gist.github.com/deadbits/e93a90aa36c9aa7b5ce1179597a6fe3d#file-generate-phrases-py |
autoevaluate/autoeval-staging-eval-project-squad_v2-82949658-14045923 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: Aiyshwariya/bert-finetuned-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Aiyshwariya/bert-finetuned-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B | ---
pretty_name: Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4674367304116677,\n\
\ \"acc_stderr\": 0.035284344124032196,\n \"acc_norm\": 0.4714260290393888,\n\
\ \"acc_norm_stderr\": 0.03526985338617593,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n\
\ \"mc2_stderr\": 0.013785031017759436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.014578995859605802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5902210714997013,\n\
\ \"acc_stderr\": 0.004907877144720015,\n \"acc_norm\": 0.7871937860983867,\n\
\ \"acc_norm_stderr\": 0.004084552641903664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633363,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633363\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104283,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104283\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6330275229357798,\n \"acc_stderr\": 0.020664675659520525,\n \"\
acc_norm\": 0.6330275229357798,\n \"acc_norm_stderr\": 0.020664675659520525\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012383,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012383\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217308,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217308\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n\
\ \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.6462324393358876,\n\
\ \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089775,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902884,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902884\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907904,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n\
\ \"mc2_stderr\": 0.013785031017759436\n }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet'
- config_name: results
data_files:
- split: 2023_09_12T09_58_38.972064
path:
- results_2023-09-12T09-58-38.972064.parquet
- split: latest
path:
- results_2023-09-12T09-58-38.972064.parquet
---
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4674367304116677,
"acc_stderr": 0.035284344124032196,
"acc_norm": 0.4714260290393888,
"acc_norm_stderr": 0.03526985338617593,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237014,
"mc2": 0.3961362396399567,
"mc2_stderr": 0.013785031017759436
},
"harness|arc:challenge|25": {
"acc": 0.4948805460750853,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5332764505119454,
"acc_norm_stderr": 0.014578995859605802
},
"harness|hellaswag|10": {
"acc": 0.5902210714997013,
"acc_stderr": 0.004907877144720015,
"acc_norm": 0.7871937860983867,
"acc_norm_stderr": 0.004084552641903664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633363,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633363
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104283,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104283
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6330275229357798,
"acc_stderr": 0.020664675659520525,
"acc_norm": 0.6330275229357798,
"acc_norm_stderr": 0.020664675659520525
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012383,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012383
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5214723926380368,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.5214723926380368,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217308,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217308
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6462324393358876,
"acc_stderr": 0.017098184708161903,
"acc_norm": 0.6462324393358876,
"acc_norm_stderr": 0.017098184708161903
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089775,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902884,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902884
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237014,
"mc2": 0.3961362396399567,
"mc2_stderr": 0.013785031017759436
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xNoper/dubai-aerial | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 33524376.0
num_examples: 72
download_size: 32535970
dataset_size: 33524376.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
robinhad/databricks-dolly-15k-uk | ---
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
language:
- uk
size_categories:
- 10K<n<100K
---
# Summary
`databricks-dolly-15k-uk` is an open source dataset based on [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) instruction-following dataset, but machine translated using [facebook/m2m100_1.2B](https://huggingface.co/facebook/m2m100_1.2B) model.
Tasks covered include brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.
Expect this dataset to not be grammatically correct and having obvious pitfalls of machine translation.
<details>
<summary>Original Summary</summary>
# Summary
`databricks-dolly-15k` is an open source dataset of instruction-following records generated by thousands of Databricks employees in several
of the behavioral categories outlined in the [InstructGPT](https://arxiv.org/abs/2203.02155) paper, including brainstorming, classification,
closed QA, generation, information extraction, open QA, and summarization.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Ukrainian
Version: 1.0
**Owner: Databricks, Inc.**
# Dataset Overview
`databricks-dolly-15k` is a corpus of more than 15,000 records generated by thousands of Databricks employees to enable large language
models to exhibit the magical interactivity of ChatGPT.
Databricks employees were invited to create prompt / response pairs in each of eight different instruction categories, including
the seven outlined in the InstructGPT paper, as well as an open-ended free-form category. The contributors were instructed to avoid using
information from any source on the web with the exception of Wikipedia (for particular subsets of instruction categories), and explicitly
instructed to avoid using generative AI in formulating instructions or responses. Examples of each behavior were provided to motivate the
types of questions and instructions appropriate to each category.
Halfway through the data generation process, contributors were given the option of answering questions posed by other contributors.
They were asked to rephrase the original question and only select questions they could be reasonably expected to answer correctly.
For certain categories contributors were asked to provide reference texts copied from Wikipedia. Reference text (indicated by the `context`
field in the actual dataset) may contain bracketed Wikipedia citation numbers (e.g. `[42]`) which we recommend users remove for downstream applications.
# Intended Uses
While immediately valuable for instruction fine tuning large language models, as a corpus of human-generated instruction prompts,
this dataset also presents a valuable opportunity for synthetic data generation in the methods outlined in the Self-Instruct paper.
For example, contributor--generated prompts could be submitted as few-shot examples to a large open language model to generate a
corpus of millions of examples of instructions in each of the respective InstructGPT categories.
Likewise, both the instructions and responses present fertile ground for data augmentation. A paraphrasing model might be used to
restate each prompt or short responses, with the resulting text associated to the respective ground-truth sample. Such an approach might
provide a form of regularization on the dataset that could allow for more robust instruction-following behavior in models derived from
these synthetic datasets.
# Dataset
## Purpose of Collection
As part of our continuing commitment to open source, Databricks developed what is, to the best of our knowledge, the first open source,
human-generated instruction corpus specifically designed to enable large language models to exhibit the magical interactivity of ChatGPT.
Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including
academic or commercial applications.
## Sources
- **Human-generated data**: Databricks employees were invited to create prompt / response pairs in each of eight different instruction categories.
- **Wikipedia**: For instruction categories that require an annotator to consult a reference text (information extraction, closed QA, summarization)
contributors selected passages from Wikipedia for particular subsets of instruction categories. No guidance was given to annotators as to how to select the
target passages.
## Annotator Guidelines
To create a record, employees were given a brief description of the annotation task as well as examples of the types of prompts typical
of each annotation task. Guidelines were succinct by design so as to encourage a high task completion rate, possibly at the cost of
rigorous compliance to an annotation rubric that concretely and reliably operationalizes the specific task. Caveat emptor.
The annotation guidelines for each of the categories are as follows:
- **Creative Writing**: Write a question or instruction that requires a creative, open-ended written response. The instruction should be reasonable to ask of a person with general world knowledge and should not require searching. In this task, your prompt should give very specific instructions to follow. Constraints, instructions, guidelines, or requirements all work, and the more of them the better.
- **Closed QA**: Write a question or instruction that requires factually correct response based on a passage of text from Wikipedia. The question can be complex and can involve human-level reasoning capabilities, but should not require special knowledge. To create a question for this task include both the text of the question as well as the reference text in the form.
- **Open QA**: Write a question that can be answered using general world knowledge or at most a single search. This task asks for opinions and facts about the world at large and does not provide any reference text for consultation.
- **Summarization**: Give a summary of a paragraph from Wikipedia. Please don't ask questions that will require more than 3-5 minutes to answer. To create a question for this task include both the text of the question as well as the reference text in the form.
- **Information Extraction**: These questions involve reading a paragraph from Wikipedia and extracting information from the passage. Everything required to produce an answer (e.g. a list, keywords etc) should be included in the passages. To create a question for this task include both the text of the question as well as the reference text in the form.
- **Classification**: These prompts contain lists or examples of entities to be classified, e.g. movie reviews, products, etc. In this task the text or list of entities under consideration is contained in the prompt (e.g. there is no reference text.). You can choose any categories for classification you like, the more diverse the better.
- **Brainstorming**: Think up lots of examples in response to a question asking to brainstorm ideas.
## Personal or Sensitive Data
This dataset contains public information (e.g., some information from Wikipedia). To our knowledge, there are no private person’s personal identifiers or sensitive information.
## Language
American English
# Known Limitations
- Wikipedia is a crowdsourced corpus and the contents of this dataset may reflect the bias, factual errors and topical focus found in Wikipedia
- Some annotators may not be native English speakers
- Annotator demographics and subject matter may reflect the makeup of Databricks employees
# License/Attribution
**Copyright (2023) Databricks, Inc.**
This dataset was developed at Databricks (https://www.databricks.com) and its use is subject to the CC BY-SA 3.0 license.
Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:
Wikipedia (various pages) - https://www.wikipedia.org/
Copyright © Wikipedia editors and contributors.
</details> |
xufana/RedPajama-INCITE-Instruct-3B-Addition | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- arithmetics
pretty_name: RedPajama Arithmetics
size_categories:
- 100K<n<1M
---
## Dataset Summary
The Arithmetic Operations Dataset is a synteticly generated collection of mathematical arithmetic operations for practice and evaluation purposes. It contains a total of 624,800 arithmetic operations, consisting of 568,000 addition operations and 56,800 subtraction operations. The dataset is designed to provide a range of arithmetic problems to train and evaluate language models for solving simple arithmetic (mostly addition, the others TBA) problems.
## Dataset Structure
The dataset is organized into two main categories: addition and subtraction. Each category contains a set of arithmetic operations in separate files (`addition.json`) and (`subtraction.json`), and the file (`dataset.json`) provides combined data from both.
### Data Instances
```bash
{
"instruction": "What is the answer to 373486002216116154 + 339369?",
"input": "373486002216116154 + 339369",
"output": "373486002216116154 + 339369 = 373486002216455523",
"answer": "373486002216455523"
},
{
"instruction": "9916607491627649 minus 581954",
"input": "9916607491627649 - 581954",
"output": "9916607491627649 - 581954 = 9916607491045695",
"answer": "9916607491045695"
},
```
### Data Fields
The files share the same structure and have 4 fields:
- `instruction`: Human instructions are generated by inserting arithmetic expressions into randomly selected templates and incorporating natural language variations. These instructions are intended to serve as prompts for instruction-finetuning, providing input for training the model.
- `input`: A randomly generated arithmetic expression, that can serve as a substitute for the 'instruction' component during training, allowing a specific focus on arithmetic operations while minimizing the impact of natural language.
- `output`: the target output for the model to learn.
- `answer`: direct numerical answer to the arithmetic task. It can be used to test learnability of various sub-tasks.
## Contact
For any questions or inquiries regarding this dataset, please contact xufana@yandex.ru.
|
barunsaha/aya_dataset_ben_translated | ---
license: apache-2.0
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: annotation_type
dtype: string
- name: user_id
dtype: string
splits:
- name: train
num_bytes: 11918662
num_examples: 6633
- name: test
num_bytes: 308222
num_examples: 250
download_size: 4492541
dataset_size: 12226884
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- question-answering
language:
- bn
pretty_name: (Subset of) Aya dataset translated to Bengali
size_categories:
- 1K<n<10K
---
`aya_dataset_ben_translated` is a subset of the [aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset), with some modifications. In particular, the original data points in Bengali (indicated by the `language` or `language_code` columns) are retained. In addition, the English and Hindi data points are translated into Bengali using Google Cloud Translation API. All columns from the original dataset are retained.
A handful of inaccuracies arising out of translation have been fixed so far. Therefore, the dataset can be a bit noisy. This is particularly true for coding related questions and answers. Moreover, some non-Bengali characters can be found in the text. In addition, potential duplicates from the original dataset are retained as well. |
Fhrozen/CABankSakura | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- ja
license:
- cc
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- found
task_categories:
- audio-classification
- automatic-speech-recognition
task_ids:
- speaker-identification
pretty_name: banksakura
tags:
- speech-recognition
---
# CABank Japanese Sakura Corpus
- Susanne Miyata
- Department of Medical Sciences
- Aichi Shukotoku University
- smiyata@asu.aasa.ac.jp
- website: https://ca.talkbank.org/access/Sakura.html
## Important
This data set is a copy from the original one located at https://ca.talkbank.org/access/Sakura.html.
## Details
- Participants: 31
- Type of Study: xxx
- Location: Japan
- Media type: audio
- DOI: doi:10.21415/T5M90R
## Citation information
Some citation here.
In accordance with TalkBank rules, any use of data from this corpus must be accompanied by at least one of the above references.
## Project Description
This corpus of 18 conversations is the product of six graduation theses on gender differences in students' group talk. Each conversation lasted between 12 and 35 minutes (avg. 25 minutes) resulting in an overall time of 7 hours and 30 minutes. 31 Students (19 female, 12 male) participated in the study (Table 1). The participants gathered in groups of 4 students, either of the same or the opposite sex (6 conversations with a group of 4 female students, 6 with 4 male students, and 6 conversations with 2 male and 2 female students), according to age (first and third year students) and affiliation (two academic departments). In addition, the participants of each conversation came from the same small-sized class and were well acquainted.
The participants were informed that their conversations may be transcribed and a video recorded for use in possible publication when recruited. Additionally, permission was asked once more after the transcription in cases where either private information had been displayed, or a misunderstanding concerning the nature and degree of the publication of the conversations became apparent during the conversation.
The recordings took place in a small conference room at the university between or after lectures. The participants were given a card with a conversation topic to start with, but were free to vary (topic 1 "What do you expect from an opposite sex friend?" [isee ni motomeru koto]; topic 2 "Are you a dog lover or a cat lover?" [inuha ka nekoha ka]; topic 3 "About part-time work" [arubaito ni tsuite]). The investigator was not present during the recording. The combination of participants, the topic, and the duration of the 18 conversations are given in Table 2.
The participants produced 15,449 utterances overall (female: 8,027 utterances, male: 7,422 utterances). All utterances were linked to video and transcribed in regular Japanese orthography and Latin script (Wakachi2002), and provided with morphological tags (JMOR04.1). Proper names were replaced by pseudonyms.
## Acknowledgements
Additional contributors: Banno, Kyoko; Konishi, Saya; Matsui, Ayumi; Matsumoto, Shiori; Oogi, Rie; Takahashi, Akane; Muraki, Kyoko.
|
SpongeBash/hugging_face | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 75931.0
num_examples: 12
download_size: 77302
dataset_size: 75931.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hugging_face"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
duckaiml/mc4_310 | ---
license: other
dataset_info:
config_name: ko
features:
- name: source
dtype: string
- name: id
dtype: string
- name: text
dtype: string
- name: added
dtype: string
- name: timestamp
dtype: timestamp[s]
- name: metadata
struct:
- name: url
dtype: string
- name: lang
struct:
- name: ko.tfrecord
dtype: float64
splits:
- name: train
num_bytes: 151177516676
num_examples: 24035493
download_size: 16185376673
dataset_size: 151177516676
configs:
- config_name: ko
data_files:
- split: train
path: ko/train-*
---
mc4 but in HPC friendly parquet format (32GiB shards)
Attribution,license, copyright info: [Google](https://www.tensorflow.org/datasets/catalog/c4) and [AI^2](https://huggingface.co/datasets/allenai/c4) for producing and uploading them.
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.