| datasetId | card |
|---|---|
GabrielTOP/eu | ---
license: openrail
---
|
typeof/arc | ---
dataset_info:
- config_name: ARC-Challenge
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 702457
num_examples: 1418
download_size: 382245
dataset_size: 702457
- config_name: ARC-Easy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1220672
num_examples: 2821
download_size: 655830
dataset_size: 1220672
- config_name: default
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
- name: text
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 1957041
num_examples: 4239
download_size: 1038847
dataset_size: 1957041
configs:
- config_name: ARC-Challenge
data_files:
- split: train
path: ARC-Challenge/train-*
- config_name: ARC-Easy
data_files:
- split: train
path: ARC-Easy/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
---
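The `choices` feature above is a sequence of parallel `text`/`label` lists, so mapping an `answerKey` back to its choice text takes one index lookup. A minimal, self-contained sketch (the record below is hypothetical, mirroring only the feature schema declared in the metadata):

```python
# Hypothetical ARC-style record matching the declared schema:
# `choices` holds parallel `text` and `label` lists, and `answerKey`
# names one of the labels.
record = {
    "id": "demo-001",
    "question": "Which gas do plants absorb during photosynthesis?",
    "choices": {
        "text": ["Oxygen", "Carbon dioxide", "Nitrogen", "Helium"],
        "label": ["A", "B", "C", "D"],
    },
    "answerKey": "B",
}

# Recover the answer text by locating the answerKey in the label list.
idx = record["choices"]["label"].index(record["answerKey"])
answer_text = record["choices"]["text"][idx]
print(answer_text)  # -> Carbon dioxide
```

The same lookup applies to rows from any of the three configs (`ARC-Challenge`, `ARC-Easy`, `default`), since they share this schema.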
|
glitchbench/GlitchBench | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: reddit
dtype: string
- name: glitch-type
dtype: string
- name: game
dtype: string
- name: source
dtype: string
- name: description
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: validation
num_bytes: 686309290
num_examples: 607
download_size: 686303027
dataset_size: 686309290
license: mit
task_categories:
- image-to-text
language:
- en
tags:
- Video Game
- Glitch
pretty_name: GlitchBench
size_categories:
- n<1K
---
# GlitchBench
This repository contains the dataset for the paper [`GlitchBench: Can large multimodal models detect video game glitches?`](https://arxiv.org/abs/2312.05291)
<div align="center">
<p> by
<a href="https://taesiri.ai">Mohammad Reza Taesiri</a>,
Tianjun Feng,
<a href="https://anhnguyen.me/research/">Anh Nguyen</a>, and
<a href="https://asgaard.ece.ualberta.ca/">Cor-Paul Bezemer</a>
</p>
<p>
(CVPR 2024)
</p>
</div>
## Abstract
Large multimodal models (LMMs) have evolved from large language models (LLMs) to integrate multiple input modalities, such as visual inputs. This integration augments the capacity of LLMs in tasks requiring visual comprehension and reasoning. However, the extent and limitations of their enhanced abilities are not fully understood. To address this gap, we introduce GlitchBench, a novel benchmark designed to test and evaluate the common-sense reasoning and visual recognition capabilities of large multimodal models. Our dataset is curated from a variety of unusual, infrequent, and glitched scenarios from video game content and aims to challenge both the visual and linguistic reasoning powers of LMMs in detecting and interpreting out-of-the-ordinary events and scene composition.
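Each GlitchBench row carries the metadata fields declared above (`game`, `glitch-type`, `source`, `description`) alongside the screenshot itself. A minimal sketch of auditing the per-type distribution, using hypothetical records that mirror only the schema (the real rows also carry an `image` field, omitted here to keep the sketch self-contained):

```python
from collections import Counter

# Hypothetical GlitchBench-style rows; values are illustrative only.
rows = [
    {"id": "g1", "game": "Skyrim", "glitch-type": "clipping",
     "description": "Character's arm passes through a wall."},
    {"id": "g2", "game": "GTA V", "glitch-type": "physics",
     "description": "Car launches vertically after a small bump."},
    {"id": "g3", "game": "Skyrim", "glitch-type": "physics",
     "description": "Dropped item bounces indefinitely."},
]

# Count rows per glitch type, as one might when profiling the
# validation split before evaluating a model on it.
by_type = Counter(r["glitch-type"] for r in rows)
print(by_type)  # physics: 2, clipping: 1
```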
|
open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo | ---
pretty_name: Evaluation run of Xenon1/Zenith-7B-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xenon1/Zenith-7B-dpo](https://huggingface.co/Xenon1/Zenith-7B-dpo) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T00:56:08.608321](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo/blob/main/results_2024-02-15T00-56-08.608321.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6007056516844032,\n\
\ \"acc_stderr\": 0.033168788562989236,\n \"acc_norm\": 0.609221428455361,\n\
\ \"acc_norm_stderr\": 0.03389796776551389,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.605026177940713,\n\
\ \"mc2_stderr\": 0.01589658250093076\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064664,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6400119498107947,\n\
\ \"acc_stderr\": 0.004790155370993449,\n \"acc_norm\": 0.829416450906194,\n\
\ \"acc_norm_stderr\": 0.003753759220205055\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671753,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.02493931390694079,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473065,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.01510455000890572,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.01510455000890572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28156424581005585,\n\
\ \"acc_stderr\": 0.015042290171866118,\n \"acc_norm\": 0.28156424581005585,\n\
\ \"acc_norm_stderr\": 0.015042290171866118\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\
\ \"acc_stderr\": 0.012585471793400659,\n \"acc_norm\": 0.4152542372881356,\n\
\ \"acc_norm_stderr\": 0.012585471793400659\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866353,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866353\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.605026177940713,\n\
\ \"mc2_stderr\": 0.01589658250093076\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16603487490523122,\n \
\ \"acc_stderr\": 0.010249811990593532\n }\n}\n```"
repo_url: https://huggingface.co/Xenon1/Zenith-7B-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|arc:challenge|25_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|gsm8k|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hellaswag|10_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T00-56-08.608321.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- '**/details_harness|winogrande|5_2024-02-15T00-56-08.608321.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T00-56-08.608321.parquet'
- config_name: results
data_files:
- split: 2024_02_15T00_56_08.608321
path:
- results_2024-02-15T00-56-08.608321.parquet
- split: latest
path:
- results_2024-02-15T00-56-08.608321.parquet
---
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo](https://huggingface.co/Xenon1/Zenith-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo",
"harness_winogrande_5",
split="train")
```
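Judging from the config listing above, a run's split name appears to be derived from its timestamp by replacing `-` and `:` with `_` (e.g. `2024-02-15T00:56:08.608321` becomes `2024_02_15T00_56_08.608321`). A small helper sketching this assumed mapping:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filename) to the
    corresponding split name, assuming '-' and ':' become '_'."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(timestamp_to_split("2024-02-15T00:56:08.608321"))
# → 2024_02_15T00_56_08.608321
```

This can be handy for selecting a specific run's split instead of `latest` when several evaluation runs accumulate in the same configuration.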
## Latest results
These are the [latest results from run 2024-02-15T00:56:08.608321](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo/blob/main/results_2024-02-15T00-56-08.608321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.6007056516844032,
"acc_stderr": 0.033168788562989236,
"acc_norm": 0.609221428455361,
"acc_norm_stderr": 0.03389796776551389,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.605026177940713,
"mc2_stderr": 0.01589658250093076
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513782
},
"harness|hellaswag|10": {
"acc": 0.6400119498107947,
"acc_stderr": 0.004790155370993449,
"acc_norm": 0.829416450906194,
"acc_norm_stderr": 0.003753759220205055
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671753,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473065,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.01510455000890572,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.01510455000890572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28156424581005585,
"acc_stderr": 0.015042290171866118,
"acc_norm": 0.28156424581005585,
"acc_norm_stderr": 0.015042290171866118
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400659,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400659
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866353,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.605026177940713,
"mc2_stderr": 0.01589658250093076
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.16603487490523122,
"acc_stderr": 0.010249811990593532
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Aldemar234/FDS02 | ---
license: openrail
---
|
rasdaw/awedrfwae | ---
license: afl-3.0
---
|
patruff/chucklesH1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 155350
num_examples: 1021
- name: test
num_bytes: 39045
num_examples: 256
download_size: 41318
dataset_size: 194395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
heliosprime/twitter_dataset_1713104385 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15345
num_examples: 42
download_size: 15997
dataset_size: 15345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713104385"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MAsad789565/OpenChat_Coding | ---
license: apache-2.0
---
|
joey234/mmlu-business_ethics-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 11534
num_examples: 5
- name: test
num_bytes: 1367503
num_examples: 100
download_size: 132815
dataset_size: 1379037
---
# Dataset Card for "mmlu-business_ethics-neg-prepend-verbal"
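Per the YAML metadata above, the `answer` feature is a class label whose names are `A`–`D`. A minimal sketch (the helper name is mine, not part of this dataset) of mapping the stored integer label back to its letter:

```python
# Class-label names declared in this card's YAML: 0 -> 'A', 1 -> 'B', 2 -> 'C', 3 -> 'D'.
ANSWER_NAMES = ["A", "B", "C", "D"]

def answer_letter(label: int) -> str:
    """Convert the integer `answer` feature to its letter name (hypothetical helper)."""
    return ANSWER_NAMES[label]
```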
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
husohome/preference-for-moderator | ---
license: apache-2.0
task_categories:
- text2text-generation
language:
- en
tags:
- not-for-all-audiences
- moderator
- preferences
- DPO
---
# Purpose
This dataset was created as a preference dataset to be used with methods like Direct Preference Optimization (DPO). The goal is to make the foundation model behave like a moderator who takes care of everyone.
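As an illustration of how a preference dataset like this is consumed, here is a minimal sketch of the standard per-pair DPO loss. This is the textbook formulation, not code from this repository; the function name and `beta` value are illustrative.

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """DPO loss for one preference pair.

    Arguments are summed log-probabilities of the chosen/rejected responses
    under the trained policy (pi_*) and the frozen reference model (ref_*).
    """
    margin = beta * ((pi_chosen - ref_chosen) - (pi_rejected - ref_rejected))
    # -log(sigmoid(margin)): small when the policy prefers the chosen response
    # more strongly than the reference does.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

At a zero margin the loss is log 2, and it shrinks as the policy's preference for the chosen response grows relative to the reference.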
|
CyberHarem/ptrd_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ptrd/PTRD/PTRD (Girls' Frontline)
This is the dataset of ptrd/PTRD/PTRD (Girls' Frontline), containing 40 images and their tags.
The core tags of this character are `breasts, long_hair, blonde_hair, large_breasts, earrings, mole, purple_eyes, mole_under_eye, hat, fur_hat, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 62.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ptrd_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 30.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ptrd_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 100 | 65.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ptrd_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 52.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ptrd_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 100 | 100.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ptrd_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ptrd_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, fishnets, jewelry, looking_at_viewer, solo, fingerless_gloves, underboob, asymmetrical_legwear, black_gloves, blush, pantyhose, single_thighhigh, white_background, ahoge, belt, covered_nipples, papakha |
| 1 | 6 |  |  |  |  |  | 1girl, anti-materiel_rifle, fingerless_gloves, solo, thighhighs, fishnets, full_body, jewelry, looking_at_viewer, belt_boots, black_footwear, black_gloves, underboob |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fishnets | jewelry | looking_at_viewer | solo | fingerless_gloves | underboob | asymmetrical_legwear | black_gloves | blush | pantyhose | single_thighhigh | white_background | ahoge | belt | covered_nipples | papakha | anti-materiel_rifle | thighhighs | full_body | belt_boots | black_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------|:--------------------|:-------|:--------------------|:------------|:-----------------------|:---------------|:--------|:------------|:-------------------|:-------------------|:--------|:-------|:------------------|:----------|:----------------------|:-------------|:------------|:-------------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | | | | | | X | X | X | X | X |
|
contentlingo/assets_post | ---
license: apache-2.0
---
|
mboth/luftVerteilen-100-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Auslass
'1': Raum
'2': VolumenstromreglerAbluft
'3': VolumenstromreglerRaum
'4': VolumenstromreglerZuluft
- name: Score
dtype: float64
splits:
- name: train
num_bytes: 103270.61044034091
num_examples: 403
- name: test
num_bytes: 91259
num_examples: 352
- name: valid
num_bytes: 91259
num_examples: 352
download_size: 111225
dataset_size: 285788.61044034094
---
# Dataset Card for "luftVerteilen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oblivionchecke/calyfilm-zdarmo | ---
license: openrail
---
|
CyberHarem/chaser_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chaser/チェイサー/追赶者 (Azur Lane)
This is the dataset of chaser/チェイサー/追赶者 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, large_breasts, long_hair, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 13.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 14.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 20.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chaser_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chaser_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, cleavage, dress, open_mouth, holding, simple_background, white_background, closed_mouth, full_body, long_sleeves, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | cleavage | dress | open_mouth | holding | simple_background | white_background | closed_mouth | full_body | long_sleeves | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-----------|:--------|:-------------|:----------|:--------------------|:-------------------|:---------------|:------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
04RR/tiny-instruct | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 1M<n<10M
pretty_name: tiny-instruct
---
# tiny-instruct-v1
This dataset is collated from multiple other open-source datasets and de-duplicated. It has a total of ~6M rows, each with an instruction and a response (a single-turn conversation).
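A minimal sketch of the kind of exact-match de-duplication described above (the column names and hashing scheme are assumptions for illustration, not taken from the actual collation pipeline):

```python
import hashlib

def dedupe(rows):
    """Drop rows whose exact (instruction, response) pair has been seen before."""
    seen, out = set(), []
    for row in rows:
        # Hash the pair with a separator so ("ab", "c") and ("a", "bc") differ.
        key = hashlib.sha256(
            (row["instruction"] + "\x00" + row["response"]).encode("utf-8")
        ).hexdigest()
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out
```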
#### Code Datasets:
1. [CodeAlpaca_20K](https://huggingface.co/datasets/HuggingFaceH4/CodeAlpaca_20K)
2. [CodeExercise-Python-27k](https://huggingface.co/datasets/codefuse-ai/CodeExercise-Python-27k)
3. [Evol-Instruct-Code-80k-v1](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1)
4. [tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes)
5. [Evol-instruction-66k](https://huggingface.co/datasets/codefuse-ai/Evol-instruction-66k)
6. [sciphi-python-textbook](https://huggingface.co/datasets/emrgnt-cmplxty/sciphi-python-textbook)
7. [programming_books_llama](https://huggingface.co/datasets/open-phi/programming_books_llama)
8. [WizardLM_evol_instruct_70k](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_70k)
#### Math Datasets:
1. [MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA)
2. [arxiv-math-instruct-50k](https://huggingface.co/datasets/ArtifactAI/arxiv-math-instruct-50k)
3. [MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct)
#### General Datasets:
1. [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca)
2. [claude_evol_instruct_210k](https://huggingface.co/datasets/Norquinal/claude_evol_instruct_210k) |
open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1 | ---
pretty_name: Evaluation run of Sao10K/Test-Raw-Solar-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Test-Raw-Solar-v1](https://huggingface.co/Sao10K/Test-Raw-Solar-v1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T15:39:57.083985](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1/blob/main/results_2024-02-10T15-39-57.083985.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549024804764068,\n\
\ \"acc_stderr\": 0.0316286322442768,\n \"acc_norm\": 0.6581057076912448,\n\
\ \"acc_norm_stderr\": 0.032269301284167065,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4898939354128775,\n\
\ \"mc2_stderr\": 0.014672110555240443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268443,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168477\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6586337382991436,\n\
\ \"acc_stderr\": 0.004731989816563668,\n \"acc_norm\": 0.8482374029077873,\n\
\ \"acc_norm_stderr\": 0.003580573563373659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337128,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337128\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723306,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723306\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331158,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331158\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.0239291555173513,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.0239291555173513\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543346,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543346\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.012768673076111903,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.012768673076111903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377215,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377215\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6977124183006536,\n \"acc_stderr\": 0.018579232711113884,\n \
\ \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.018579232711113884\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4898939354128775,\n\
\ \"mc2_stderr\": 0.014672110555240443\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5056861258529188,\n \
\ \"acc_stderr\": 0.013771594106283033\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Test-Raw-Solar-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-39-57.083985.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T15-39-57.083985.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- '**/details_harness|winogrande|5_2024-02-10T15-39-57.083985.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T15-39-57.083985.parquet'
- config_name: results
data_files:
- split: 2024_02_10T15_39_57.083985
path:
- results_2024-02-10T15-39-57.083985.parquet
- split: latest
path:
- results_2024-02-10T15-39-57.083985.parquet
---
# Dataset Card for Evaluation run of Sao10K/Test-Raw-Solar-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Test-Raw-Solar-v1](https://huggingface.co/Sao10K/Test-Raw-Solar-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1",
"harness_winogrande_5",
	split="latest")
```
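The repository id used above follows the leaderboard's naming convention: `open-llm-leaderboard/details_` plus the model id with `/` replaced by `__`. A minimal sketch of deriving it (the `load_dataset` call is shown commented out, since it requires network access):

```python
# Derive the details-repo id for a given model on the Open LLM Leaderboard.
model_id = "Sao10K/Test-Raw-Solar-v1"
repo = "open-llm-leaderboard/details_" + model_id.replace("/", "__")
# repo == "open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1"

# To fetch the aggregated metrics rather than per-task details, load the
# "results" configuration instead of a task configuration:
# from datasets import load_dataset
# results = load_dataset(repo, "results", split="latest")
```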
## Latest results
These are the [latest results from run 2024-02-10T15:39:57.083985](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Test-Raw-Solar-v1/blob/main/results_2024-02-10T15-39-57.083985.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6549024804764068,
"acc_stderr": 0.0316286322442768,
"acc_norm": 0.6581057076912448,
"acc_norm_stderr": 0.032269301284167065,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4898939354128775,
"mc2_stderr": 0.014672110555240443
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268443,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168477
},
"harness|hellaswag|10": {
"acc": 0.6586337382991436,
"acc_stderr": 0.004731989816563668,
"acc_norm": 0.8482374029077873,
"acc_norm_stderr": 0.003580573563373659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337128,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337128
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723306,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993469,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993469
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331158,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331158
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.0239291555173513,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.0239291555173513
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.012768673076111903,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.012768673076111903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377215,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6977124183006536,
"acc_stderr": 0.018579232711113884,
"acc_norm": 0.6977124183006536,
"acc_norm_stderr": 0.018579232711113884
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4898939354128775,
"mc2_stderr": 0.014672110555240443
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.5056861258529188,
"acc_stderr": 0.013771594106283033
}
}
```
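For readers who want to aggregate these scores programmatically, here is a minimal sketch (plain Python, no external dependencies) that averages `acc_norm` over the MMLU (`hendrycksTest`) subtasks. The inline dictionary is a small excerpt mirroring the JSON above, not the full results.

```python
# Minimal excerpt mirroring the results JSON above; the values are copied
# from the card, but only a few subtasks are included for illustration.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6322525597269625},
    "harness|hellaswag|10": {"acc_norm": 0.8482374029077873},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.743421052631579},
}

# The MMLU score reported on the leaderboard is the mean over the 57
# hendrycksTest subtasks; here we average only the excerpt above.
mmlu_scores = [v["acc_norm"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
```

The same filtering pattern works on the full JSON once it is loaded with `json.load`.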
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HydraLM/partitioned_v2_standardized_4 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 65433364.62261582
num_examples: 136373
download_size: 15662586
dataset_size: 65433364.62261582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PJDan/train | ---
license: unknown
---
|
google/boolq | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
paperswithcode_id: boolq
pretty_name: BoolQ
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: bool
- name: passage
dtype: string
splits:
- name: train
num_bytes: 5829584
num_examples: 9427
- name: validation
num_bytes: 1998182
num_examples: 3270
download_size: 4942776
dataset_size: 7827766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for Boolq
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** https://github.com/google-research-datasets/boolean-questions
- **Paper:** https://arxiv.org/abs/1905.10044
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 8.77 MB
- **Size of the generated dataset:** 7.83 MB
- **Total amount of disk used:** 16.59 MB
### Dataset Summary
BoolQ is a question answering dataset for yes/no questions containing 15,942 examples. The questions are naturally
occurring: they are generated in unprompted and unconstrained settings.
Each example is a triplet of (question, passage, answer), with the title of the page as optional additional context.
The text-pair classification setup is similar to existing natural language inference tasks.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 8.77 MB
- **Size of the generated dataset:** 7.83 MB
- **Total amount of disk used:** 16.59 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answer": false,
"passage": "\"All biomass goes through at least some of these steps: it needs to be grown, collected, dried, fermented, distilled, and burned...",
"question": "does ethanol take more energy make that produces"
}
```
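To illustrate the text-pair classification setup mentioned in the summary, here is a small sketch (an assumed downstream usage pattern, not part of the dataset itself) that turns a record like the one above into an NLI-style (premise, hypothesis, label) triple:

```python
# A record shaped like the (cropped) validation example above.
record = {
    "answer": False,
    "passage": "All biomass goes through at least some of these steps: "
               "it needs to be grown, collected, dried, fermented, "
               "distilled, and burned...",
    "question": "does ethanol take more energy make that produces",
}

# The passage plays the role of the premise, the question that of the
# hypothesis, and the boolean answer becomes a binary label.
premise = record["passage"]
hypothesis = record["question"]
label = int(record["answer"])  # False -> 0, True -> 1
```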
### Data Fields
The data fields are the same among all splits.
#### default
- `question`: a `string` feature.
- `answer`: a `bool` feature.
- `passage`: a `string` feature.
### Data Splits
| name |train|validation|
|-------|----:|---------:|
|default| 9427| 3270|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
BoolQ is released under the [Creative Commons Share-Alike 3.0](https://creativecommons.org/licenses/by-sa/3.0/) license.
### Citation Information
```
@inproceedings{clark2019boolq,
title = {BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions},
  author = {Clark, Christopher and Lee, Kenton and Chang, Ming-Wei and Kwiatkowski, Tom and Collins, Michael and Toutanova, Kristina},
booktitle = {NAACL},
year = {2019},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@lhoestq](https://github.com/lhoestq), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@albertvillanova](https://github.com/albertvillanova) for adding this dataset. |
Codec-SUPERB/covost2_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 4043997731.972
num_examples: 23778
- name: academicodec_hifi_16k_320d
num_bytes: 3952554603.408
num_examples: 23778
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 3952554603.408
num_examples: 23778
- name: academicodec_hifi_24k_320d
num_bytes: 5923762216.848
num_examples: 23778
- name: audiodec_24k_320d
num_bytes: 5930095724.928
num_examples: 23778
- name: dac_16k
num_bytes: 3954367438.128
num_examples: 23778
- name: dac_24k
num_bytes: 5930095724.928
num_examples: 23778
- name: dac_44k
num_bytes: 10894130391.564
num_examples: 23778
- name: encodec_24k
num_bytes: 5930140332.456
num_examples: 23778
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 3953549665.152
num_examples: 23778
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 3953549665.152
num_examples: 23778
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 3953549665.152
num_examples: 23778
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 3953549665.152
num_examples: 23778
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 3953549665.152
num_examples: 23778
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 3953549665.152
num_examples: 23778
- name: speech_tokenizer_16k
num_bytes: 3964348491.408
num_examples: 23778
download_size: 72248199219
dataset_size: 78197345249.96
---
# Dataset Card for "covost2_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abzzer/Social-Media-Post-Relevance | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/7b29f2d3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1339
dataset_size: 184
---
# Dataset Card for "7b29f2d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenRL/DeepFakeFace | ---
license: openrail
task_categories:
- image-to-image
language:
- en
tags:
- deepfake
- diffusion model
pretty_name: DeepFakeFace
---
The dataset accompanying the paper
"Robustness and Generalizability of Deepfake Detection: A Study with Diffusion Models".
[[Website](https://sites.google.com/view/deepfakeface/home)] [[paper](https://arxiv.org/abs/2309.02218)] [[GitHub](https://github.com/OpenRL-Lab/DeepFakeFace)].
### Introduction
Welcome to the **DeepFakeFace (DFF)** dataset! Here we present a meticulously curated collection of artificial celebrity faces, crafted using cutting-edge diffusion models.
Our aim is to tackle the rising challenge posed by deepfakes in today's digital landscape.
Here are some example images in our dataset:

Our proposed DeepFakeFace (DFF) dataset is generated by various diffusion models, aiming to protect the privacy of celebrities.
There are four zip files in our dataset, and each file contains 30,000 images.
We maintain the same directory structure as the IMDB-WIKI dataset, from which the real images are selected.
- inpainting.zip is generated by the Stable Diffusion Inpainting model.
- insight.zip is generated by the InsightFace toolbox.
- text2img.zip is generated by Stable Diffusion V1.5
- wiki.zip contains original real images selected from the IMDB-WIKI dataset.
### Deepfake Dataset Comparison
We compare our dataset with previous datasets here:

### Experimental Results
Performance of RECCE across different generators, measured in terms of Acc (%), AUC (%), and EER (%):

Robustness evaluation in terms of ACC (%), AUC (%) and EER (%):

### Cite
Please cite our paper if you use our codes or our dataset in your own work:
```
@misc{song2023robustness,
title={Robustness and Generalizability of Deepfake Detection: A Study with Diffusion Models},
author={Haixu Song and Shiyu Huang and Yinpeng Dong and Wei-Wei Tu},
year={2023},
eprint={2309.02218},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
|
DBQ/Mr.Porter.Product.prices.Russia | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Russia - Mr Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Mr Porter
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 8722278
num_examples: 26612
download_size: 2022635
dataset_size: 8722278
---
# Mr Porter web scraped data
## About the website
**Mr Porter** is a prominent operator in the **online retail industry** within the EMEA region, specifically in **Russia**. The e-commerce industry in Russia is growing rapidly, amidst the increasing tech-savviness and online shopping habits of consumers. Mr Porter's foundation on sophisticated technology offers it a competitive edge, especially in tailoring to Russia's vast and diverse consumer base. **E-commerce**, particularly **online fashion retailing**, is emerging as a dominant industry, influenced by factors like advanced technology, easy access to the internet, and changing consumer patterns. The dataset under observation contains **e-commerce product-list page (PLP) data** pertaining to Mr Porter's operations in Russia.
## Link to **dataset**
[Russia - Mr Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Mr%20Porter%20Product-prices%20Russia/r/reckLATEvXV9HEd9d)
|
turkish-nlp-suite/beyazperde-top-300-movie-reviews | ---
language:
- tr
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: BeyazPerde Top 300 Movie Reviews
---
# Dataset Card for turkish-nlp-suite/beyazperde-top-300-movie-reviews
<img src="https://raw.githubusercontent.com/turkish-nlp-suite/.github/main/profile/beyazPerde.png" width="20%" height="20%">
## Dataset Description
- **Repository:** [BeyazPerde Top 300 Movie Reviews](https://github.com/turkish-nlp-suite/BeyazPerde-Movie-Reviews/)
- **Paper:** [ACL link](https://aclanthology.org/2023.acl-long.768/)
- **Dataset:** BeyazPerde Top 300 Movie Reviews
- **Domain:** Social Media
### Dataset Summary
Beyazperde Movie Reviews offers Turkish sentiment analysis datasets scraped from the popular movie review website Beyazperde.com. The Top 300 Movies subset includes audience reviews of the 300 best movies of all time. Here's the star rating distribution:
| star rating | count |
|---|---|
| 0.5 | 1,657 |
| 1.0 | 535 |
| 1.5 | 273 |
| 2.0 | 608 |
| 2.5 | 2,439 |
| 3.0 | 2,277 |
| 3.5 | 5,550 |
| 4.0 | 13,248 |
| 4.5 | 10,077 |
| 5.0 | 17,351 |
| total | 54,015 |
As one can see, this dataset is highly unbalanced: the number of 4- and 5-star ratings is much higher than that of 0-, 1-, 2- and 3-star reviews. This dataset offers the challenge of understanding sentiment in a refined way, dissecting positive sentiment into "very positive" versus "okayish positive".
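The imbalance is easy to quantify from the table above. The snippet below (plain Python; just an illustration, not code shipped with the dataset) reproduces the total and the share of 4+ star reviews:

```python
# Star-rating counts copied from the distribution table above.
counts = {0.5: 1657, 1.0: 535, 1.5: 273, 2.0: 608, 2.5: 2439,
          3.0: 2277, 3.5: 5550, 4.0: 13248, 4.5: 10077, 5.0: 17351}

total = sum(counts.values())                      # 54015, matching the table
high = sum(n for r, n in counts.items() if r >= 4.0)
high_share = high / total                         # roughly 75% of reviews are 4+ stars
```

Models trained on this data should therefore be evaluated with imbalance in mind (e.g. with class weights or macro-averaged metrics).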
### Dataset Instances
An instance of this dataset looks as follows:
```
{
"movie": "Bay Evet",
"text": "Tam kıvamında çok keyifli bir film",
"rating": 4
}
```
### Data Split
| name |train|validation|test|
|---------|----:|---:|---:|
|BeyazPerde Top 300 Movie Reviews|44015|5000|5000|
### Citation
This work is supported by the Google Developer Experts Program. It is part of the Duygu 2022 Fall-Winter collection, "Turkish NLP with Duygu" / "Duygu'yla Türkçe NLP". All rights reserved. If you'd like to use this dataset in your own work, please kindly cite [A Diverse Set of Freely Available Linguistic Resources for Turkish](https://aclanthology.org/2023.acl-long.768/):
```
@inproceedings{altinok-2023-diverse,
title = "A Diverse Set of Freely Available Linguistic Resources for {T}urkish",
author = "Altinok, Duygu",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.768",
pages = "13739--13750",
abstract = "This study presents a diverse set of freely available linguistic resources for Turkish natural language processing, including corpora, pretrained models and education material. Although Turkish is spoken by a sizeable population of over 80 million people, Turkish linguistic resources for natural language processing remain scarce. In this study, we provide corpora to allow practitioners to build their own applications and pretrained models that would assist industry researchers in creating quick prototypes. The provided corpora include named entity recognition datasets of diverse genres, including Wikipedia articles and supplement products customer reviews. In addition, crawling e-commerce and movie reviews websites, we compiled several sentiment analysis datasets of different genres. Our linguistic resources for Turkish also include pretrained spaCy language models. To the best of our knowledge, our models are the first spaCy models trained for the Turkish language. Finally, we provide various types of education material, such as video tutorials and code examples, that can support the interested audience on practicing Turkish NLP. The advantages of our linguistic resources are three-fold: they are freely available, they are first of their kind, and they are easy to use in a broad range of implementations. Along with a thorough description of the resource creation process, we also explain the position of our resources in the Turkish NLP world.",
}
```
|
nikolaalx/train-json | ---
license: apache-2.0
---
|
WUYONGF/ikun | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2930652.0
num_examples: 20
download_size: 2465736
dataset_size: 2930652.0
---
# Dataset Card for "ikun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_232 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22610275488.25
num_examples: 235406
download_size: 20883870881
dataset_size: 22610275488.25
---
# Dataset Card for "chunk_232"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tsutsuji_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsutsuji (Pokémon)
This is the dataset of tsutsuji (Pokémon), containing 225 images and their tags.
The core tags of this character are `long_hair, brown_hair, twintails, red_eyes, breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 225 | 186.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsutsuji_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 225 | 121.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsutsuji_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 482 | 231.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsutsuji_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 225 | 172.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsutsuji_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 482 | 303.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsutsuji_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tsutsuji_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The tag clustering results are listed below; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, collared_shirt, grey_dress, pink_pantyhose, short_sleeves, solo, white_shirt, full_body, looking_at_viewer, open_mouth, pink_ascot, simple_background, blush, eyelashes, hand_up, mary_janes, standing, white_background, black_footwear, hair_rings, hand_on_hip, index_finger_raised, necktie |
| 1 | 17 |  |  |  |  |  | 1girl, pantyhose, grey_dress, ascot, pokemon_(creature), short_sleeves, hair_pulled_back, looking_at_viewer, mary_janes, smile, blush, white_shirt, closed_mouth, collared_shirt, full_body, hand_on_hip, open_mouth |
| 2 | 5 |  |  |  |  |  | 1girl, blush, hair_pulled_back, pink_pantyhose, solo, ascot, grey_dress, open_mouth, necktie, looking_at_viewer, panties_under_pantyhose |
| 3 | 16 |  |  |  |  |  | elbow_gloves, 1girl, looking_at_viewer, witch_hat, earrings, black_dress, black_gloves, black_headwear, smile, blush, solo, halloween, eyelashes, pokemon_(creature), pantyhose |
| 4 | 15 |  |  |  |  |  | 1girl, blush, hetero, 1boy, nipples, solo_focus, penis, sex, nude, open_mouth, vaginal, hair_pulled_back, pink_eyes, pussy, spread_legs, sweat, censored, lying, navel, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | grey_dress | pink_pantyhose | short_sleeves | solo | white_shirt | full_body | looking_at_viewer | open_mouth | pink_ascot | simple_background | blush | eyelashes | hand_up | mary_janes | standing | white_background | black_footwear | hair_rings | hand_on_hip | index_finger_raised | necktie | pantyhose | ascot | pokemon_(creature) | hair_pulled_back | smile | closed_mouth | panties_under_pantyhose | elbow_gloves | witch_hat | earrings | black_dress | black_gloves | black_headwear | halloween | hetero | 1boy | nipples | solo_focus | penis | sex | nude | vaginal | pink_eyes | pussy | spread_legs | sweat | censored | lying | navel | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------|:-----------------|:----------------|:-------|:--------------|:------------|:--------------------|:-------------|:-------------|:--------------------|:--------|:------------|:----------|:-------------|:-----------|:-------------------|:-----------------|:-------------|:--------------|:----------------------|:----------|:------------|:--------|:---------------------|:-------------------|:--------|:---------------|:--------------------------|:---------------|:------------|:-----------|:--------------|:---------------|:-----------------|:------------|:---------|:-------|:----------|:-------------|:--------|:------|:-------|:----------|:------------|:--------|:--------------|:--------|:-----------|:--------|:--------|:-----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | X | | X | | X | X | X | X | | | X | | | X | | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | | X | | | X | X | | | X | | | | | | | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | | | | | X | | | X | | | | X | X | | | | | | | | | | X | | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Team-PIXEL/rendered-wikipedia-english | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
- gfdl
multilinguality:
- monolingual
pretty_name: Team-PIXEL/rendered-wikipedia-english
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- masked-auto-encoding
- rendered-language-modelling
task_ids:
- masked-auto-encoding
- rendered-language-modeling
paperswithcode_id: null
---
# Dataset Card for Team-PIXEL/rendered-wikipedia-english
## Dataset Description
- **Homepage:** [https://github.com/xplip/pixel](https://github.com/xplip/pixel)
- **Repository:** [https://github.com/xplip/pixel](https://github.com/xplip/pixel)
- **Paper:** [Language Modelling with Pixels](https://arxiv.org/abs/2207.06991)
- **Point of Contact:** [Phillip Rust](mailto:p.rust@di.ku.dk)
- **Size of downloaded dataset files:** 125.66 GB
- **Size of the generated dataset:** 125.56 GB
- **Total amount of disk used:** 251.22 GB
### Dataset Summary
This dataset contains the full English Wikipedia from February 1, 2018, rendered into images of 16x8464 resolution.
The original text dataset was built from a [Wikipedia dump](https://dumps.wikimedia.org/). Each example in the original *text* dataset contained the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.). Each *rendered* example contains a subset of one full article. This rendered English Wikipedia was used to train the [PIXEL](https://huggingface.co/Team-PIXEL/pixel-base) model introduced in the paper [Language Modelling with Pixels](https://arxiv.org/abs/2207.06991) by Phillip Rust, Jonas F. Lotz, Emanuele Bugliarello, Elizabeth Salesky, Miryam de Lhoneux, and Desmond Elliott.
The original Wikipedia text dataset was rendered article-by-article into 11.4M examples containing approximately 2B words in total. The dataset is stored as a collection of 338 parquet files.
It was rendered using the script openly available at [https://github.com/xplip/pixel/blob/main/scripts/data/prerendering/prerender_wikipedia.py](https://github.com/xplip/pixel/blob/main/scripts/data/prerendering/prerender_wikipedia.py). The text renderer uses a PyGame backend and a collection of merged Google Noto Sans fonts. The PyGame backend does not support complex text layouts (e.g. ligatures and right-to-left scripts) or emoji, so occurrences of such text in the Wikipedia data have not been rendered accurately.
Each example consists of a "pixel_values" field storing a 16x8464 (height, width) grayscale image of rendered text, and an integer "num_patches" recording how many of the image's 529 non-overlapping 16x16 patches contain actual text, i.e. are neither blank (fully white) nor the fully black end-of-sequence patch.
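The patch accounting can be sketched in a few lines. This is a minimal reconstruction for illustration, not the official preprocessing (which lives in the prerendering script linked above); the helper name `count_text_patches` and the min/max threshold logic are our assumptions about how blank and end-of-sequence patches are distinguished.

```python
import numpy as np

def count_text_patches(pixel_values: np.ndarray, patch_size: int = 16) -> int:
    """Count the 16x16 patches of a rendered example that contain text.

    A patch counts as text if it is neither fully white (blank padding)
    nor fully black (the end-of-sequence patch).
    """
    height, width = pixel_values.shape   # expected (16, 8464)
    num_patches = width // patch_size    # 529 for the full-width image
    count = 0
    for i in range(num_patches):
        patch = pixel_values[:, i * patch_size:(i + 1) * patch_size]
        if patch.min() == 255:           # fully white -> blank padding
            continue
        if patch.max() == 0:             # fully black -> end-of-sequence patch
            continue
        count += 1
    return count

# Synthetic example: 10 "text" patches, 1 black EOS patch, rest blank.
img = np.full((16, 8464), 255, dtype=np.uint8)
img[:, :10 * 16] = 128           # pretend-rendered text
img[:, 10 * 16:11 * 16] = 0      # end-of-sequence patch
print(count_text_patches(img))   # -> 10
```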
You can load the dataset as follows:
```python
from datasets import load_dataset
# Download the full dataset to disk
load_dataset("Team-PIXEL/rendered-wikipedia-english", split="train")
# Stream the dataset directly from the hub
load_dataset("Team-PIXEL/rendered-wikipedia-english", split="train", streaming=True)
```
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 125.66 GB
- **Size of the generated dataset:** 125.56 GB
- **Total amount of disk used:** 251.22 GB
An example of 'train' looks as follows.
```
{
"pixel_values": <PIL.PngImagePlugin.PngImageFile image mode=L size=8464x16
"num_patches": "469"
}
```
### Data Fields
The data fields are the same among all splits.
- `pixel_values`: an `Image` feature.
- `num_patches`: a `Value(dtype="int64")` feature.
### Data Splits
|train|
|:----|
|11446535|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC BY-SA) and the GNU Free Documentation License (GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible licenses and cannot be reused under the GFDL; such text is identified in the page footer, in the page history, or on the discussion page of the article that uses the text.
### Citation Information
```bibtex
@article{rust-etal-2022-pixel,
title={Language Modelling with Pixels},
author={Phillip Rust and Jonas F. Lotz and Emanuele Bugliarello and Elizabeth Salesky and Miryam de Lhoneux and Desmond Elliott},
journal={arXiv preprint},
year={2022},
url={https://arxiv.org/abs/2207.06991}
}
```
### Contact Person
This dataset was added by Phillip Rust.
Github: [@xplip](https://github.com/xplip)
Twitter: [@rust_phillip](https://twitter.com/rust_phillip) |
ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true_y | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83939
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_Diabetes130US_gosdt_l512_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 487407664
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_Diabetes130US_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Azure99__blossom-v2-3b | ---
pretty_name: Evaluation run of Azure99/blossom-v2-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v2-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.034395973154362415,\n\
\ \"em_stderr\": 0.0018663495487686885,\n \"f1\": 0.11167470637583889,\n\
\ \"f1_stderr\": 0.0023912000923338094,\n \"acc\": 0.2966551039299941,\n\
\ \"acc_stderr\": 0.007917209289296998\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.034395973154362415,\n \"em_stderr\": 0.0018663495487686885,\n\
\ \"f1\": 0.11167470637583889,\n \"f1_stderr\": 0.0023912000923338094\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948054\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5880031570639306,\n \"acc_stderr\": 0.013833112857645942\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v2-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T18-36-49.609194.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-49.609194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:22:00.974376.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T18_36_49.609194
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-49.609194.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_22_00.974376
path:
- results_2023-08-09T15:22:00.974376.parquet
- split: 2023_09_16T18_36_49.609194
path:
- results_2023-09-16T18-36-49.609194.parquet
- split: latest
path:
- results_2023-09-16T18-36-49.609194.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v2-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Azure99/blossom-v2-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Azure99/blossom-v2-3b](https://huggingface.co/Azure99/blossom-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v2-3b",
"harness_winogrande_5",
split="train")
```
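As a side note, the timestamped split names listed in the configurations above are derived from the run timestamps in the result filenames, with the separator characters replaced by underscores. A minimal sketch of that mapping (the helper name is ours, not part of the `datasets` API):

```python
def run_timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp as it appears in a result filename
    (e.g. '2023-09-16T18-36-49.609194') to the matching split
    name (e.g. '2023_09_16T18_36_49.609194')."""
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2023-09-16T18-36-49.609194"))
# 2023_09_16T18_36_49.609194
```

This also covers older filenames that use colons in the time part, such as `2023-08-09T15:22:00.974376`.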
## Latest results
These are the [latest results from run 2023-09-16T18:36:49.609194](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v2-3b/blob/main/results_2023-09-16T18-36-49.609194.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686885,
"f1": 0.11167470637583889,
"f1_stderr": 0.0023912000923338094,
"acc": 0.2966551039299941,
"acc_stderr": 0.007917209289296998
},
"harness|drop|3": {
"em": 0.034395973154362415,
"em_stderr": 0.0018663495487686885,
"f1": 0.11167470637583889,
"f1_stderr": 0.0023912000923338094
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948054
},
"harness|winogrande|5": {
"acc": 0.5880031570639306,
"acc_stderr": 0.013833112857645942
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weqweasdas/open_chat_0106_ultra_feedback_n32 | ---
configs:
- config_name: default
data_files:
- split: ds0
path: data/ds0-*
- split: ds1
path: data/ds1-*
- split: ds2
path: data/ds2-*
- split: ds3
path: data/ds3-*
dataset_info:
features:
- name: responses
sequence: string
- name: prompt
dtype: string
- name: old_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: ds0
num_bytes: 628481099
num_examples: 14992
- name: ds1
num_bytes: 627666728
num_examples: 14992
- name: ds2
num_bytes: 632353708
num_examples: 14992
- name: ds3
num_bytes: 628262601
num_examples: 14992
download_size: 1030945655
dataset_size: 2516764136
---
# Dataset Card for "open_chat_0106_ultra_feedback_n32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/1722_Hours_Near_field_Conference_Speech_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Mandarin (China) near-field conference speech dataset, collected from the output of an AU central console mixer in real speech scenes. It features natural pronunciation with almost no environmental noise and covers a variety of topics. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring the protection of user privacy and legal rights throughout the data collection, storage, and usage processes; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1066?source=Huggingface
# Specifications
## Format
44.1kHz, 16bit, wav, dual channel.
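As a quick sanity check when working with the audio, a file's header can be verified against this spec using Python's standard-library `wave` module (a sketch; the helper name is ours, and `path` is whatever local file you are inspecting):

```python
import wave

def matches_dataset_format(path: str) -> bool:
    """Check a WAV file against the spec above:
    44.1 kHz sample rate, 16-bit (2-byte) samples, dual channel."""
    with wave.open(path, "rb") as wf:
        return (wf.getframerate() == 44100
                and wf.getsampwidth() == 2
                and wf.getnchannels() == 2)
```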
## Recording environment
Mixed
## Recording content
Lectures on science and technology, training, publicity, etc.
## Device
AU Center Console Mixer
## Country
China(CHN)
## Language
Mandarin
## Features of annotation
Annotation of transcription text, speaker identification, and gender
## Accuracy Rate
Sentence Accuracy Rate (SAR): 97%
# Licensing Information
Commercial License
|
GEM-submissions/lewtun__this-is-a-test-name__1655913794 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
autoevaluate/autoeval-eval-inverse-scaling__quote-repetition-inverse-scaling__quot-3aff83-1695059592 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/quote-repetition
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: inverse-scaling/quote-repetition
dataset_config: inverse-scaling--quote-repetition
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: inverse-scaling/quote-repetition
* Config: inverse-scaling--quote-repetition
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
raflibagas/PsychologistSamhog | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 460937
num_examples: 1000
download_size: 143081
dataset_size: 460937
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Timbrt/SciOL-text | ---
license: cc-by-4.0
language:
- en
size_categories:
- 10B<n<100B
pretty_name: Scientific Openly-Licensed Publications - Text
---
# Scientific Openly-Licensed Publications
This repository contains companion material for the following [publication](https://openaccess.thecvf.com/content/WACV2024/papers/Tarsi_SciOL_and_MuLMS-Img_Introducing_a_Large-Scale_Multimodal_Scientific_Dataset_and_WACV_2024_paper.pdf):
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. **SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain.** WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to [Tim Tarsi](mailto:tim.tarsi@gmail.com)
## Summary
Scientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific publications converted into semi-structured data. SciOL contains over 14 billion tokens of extracted and structured text.
**Note: This repository only contains the textual data of SciOL. For the figures and captions see:**
[SciOL-CI](https://huggingface.co/datasets/Timbrt/SciOL-CI)
## Data Format
We provide the annotations of our dataset in JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID, or DOAJ ID, as well as by keyword.
## Annotation Schema
Annotations are structured as in the following schema:
```
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"doi": {
"type": "string"
},
"keywords": {
"type": "array",
"items": {
"type": "string"
}
},
"license": {
"type": "string"
},
"article": {
"type": "object",
"properties": {
"title": {
"type": "string"
},
"authors": {
"type": "array",
"items": {
"type": "string"
}
},
"abstract": {
"type": "string"
},
"body_text": {
"type": "string"
},
"bibliography": {
"type": "string"
}
}
}
}
}
```
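For illustration, an annotation following this schema can be read straight out of one of the zip archives with Python's standard library alone. The record and the file name inside the archive below are made up for the example, not taken from the corpus:

```python
import io
import json
import zipfile

# A made-up annotation following the schema above (illustrative only).
annotation = {
    "doi": "10.1000/example.doi",
    "keywords": ["materials science", "physics"],
    "license": "CC BY 4.0",
    "article": {
        "title": "An Example Title",
        "authors": ["A. Author", "B. Author"],
        "abstract": "A short abstract.",
        "body_text": "The full body text.",
        "bibliography": "[1] A reference.",
    },
}

# Annotations are grouped into zip archives; simulate one in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("10.1000_example.doi.json", json.dumps(annotation))

# Reading a record back out of the archive.
with zipfile.ZipFile(buf) as zf:
    record = json.loads(zf.read("10.1000_example.doi.json"))

print(record["article"]["title"])  # An Example Title
```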
## Citation
If you use our dataset in your scientific work, please cite our paper:
```
@InProceedings{Tarsi_2024_WACV,
author = {Tarsi, Tim and Adel, Heike and Metzen, Jan Hendrik and Zhang, Dan and Finco, Matteo and Friedrich, Annemarie},
title = {SciOL and MuLMS-Img: Introducing a Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2024},
pages = {4560-4571}
}
```
## License
The SciOL corpus is released under the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license. |
dmrau/cqudubstack-unix | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 72357
num_examples: 1072
- name: corpus
num_bytes: 46102756
num_examples: 47382
download_size: 24571026
dataset_size: 46175113
---
# Dataset Card for "cqudubstack-unix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lukejagg/dog-test | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1033492.0
num_examples: 2
- name: test
num_bytes: 1033544.0
num_examples: 2
download_size: 2066534
dataset_size: 2067036.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mask-distilled-one-sec-cv12/chunk_234 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1136473296
num_examples: 223188
download_size: 1161235968
dataset_size: 1136473296
---
# Dataset Card for "chunk_234"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamlouly/enron_spam_data | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_formulae__Dorflan | ---
pretty_name: Evaluation run of formulae/Dorflan
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [formulae/Dorflan](https://huggingface.co/formulae/Dorflan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_formulae__Dorflan\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T08:11:23.880796](https://huggingface.co/datasets/open-llm-leaderboard/details_formulae__Dorflan/blob/main/results_2023-10-24T08-11-23.880796.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.16851929530201343,\n\
\ \"em_stderr\": 0.0038334566477606843,\n \"f1\": 0.2636723993288601,\n\
\ \"f1_stderr\": 0.003974880412044246,\n \"acc\": 0.36495772729693454,\n\
\ \"acc_stderr\": 0.007112996736385248\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.16851929530201343,\n \"em_stderr\": 0.0038334566477606843,\n\
\ \"f1\": 0.2636723993288601,\n \"f1_stderr\": 0.003974880412044246\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401502038\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620292\n\
\ }\n}\n```"
repo_url: https://huggingface.co/formulae/Dorflan
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T08_11_23.880796
path:
- '**/details_harness|drop|3_2023-10-24T08-11-23.880796.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T08-11-23.880796.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T08_11_23.880796
path:
- '**/details_harness|gsm8k|5_2023-10-24T08-11-23.880796.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T08-11-23.880796.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-07-52.776244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-07-52.776244.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T08_11_23.880796
path:
- '**/details_harness|winogrande|5_2023-10-24T08-11-23.880796.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T08-11-23.880796.parquet'
- config_name: results
data_files:
- split: 2023_10_04T08_07_52.776244
path:
- results_2023-10-04T08-07-52.776244.parquet
- split: 2023_10_24T08_11_23.880796
path:
- results_2023-10-24T08-11-23.880796.parquet
- split: latest
path:
- results_2023-10-24T08-11-23.880796.parquet
---
# Dataset Card for Evaluation run of formulae/Dorflan
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/formulae/Dorflan
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [formulae/Dorflan](https://huggingface.co/formulae/Dorflan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_formulae__Dorflan",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T08:11:23.880796](https://huggingface.co/datasets/open-llm-leaderboard/details_formulae__Dorflan/blob/main/results_2023-10-24T08-11-23.880796.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its own configuration, with results in the "latest" split of each):
```json
{
"all": {
"em": 0.16851929530201343,
"em_stderr": 0.0038334566477606843,
"f1": 0.2636723993288601,
"f1_stderr": 0.003974880412044246,
"acc": 0.36495772729693454,
"acc_stderr": 0.007112996736385248
},
"harness|drop|3": {
"em": 0.16851929530201343,
"em_stderr": 0.0038334566477606843,
"f1": 0.2636723993288601,
"f1_stderr": 0.003974880412044246
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401502038
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620292
}
}
```
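Though the card does not state it explicitly, the `"all"` accuracy fields above appear to be simple (unweighted) means of the per-task values. A quick sanity check under that assumption:

```python
# Per-task values copied from the "latest results" JSON above.
gsm8k_acc = 0.0037907505686125853
winogrande_acc = 0.7261247040252565
gsm8k_stderr = 0.0016927007401502038
winogrande_stderr = 0.012533292732620292

# Unweighted mean over the two acc-reporting tasks.
all_acc = (gsm8k_acc + winogrande_acc) / 2          # matches "all".acc
all_acc_stderr = (gsm8k_stderr + winogrande_stderr) / 2  # matches "all".acc_stderr
```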
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
antoinelb7/alloprof | ---
license: mit
task_categories:
- question-answering
- text-retrieval
language:
- fr
tags:
- education
size_categories:
- 10K<n<100K
---
# Alloprof dataset
This is the dataset referred to in our paper:
Alloprof: a new French question-answer education dataset and its use in an information retrieval case study (https://arxiv.org/abs/2302.07738)
This dataset was provided by [AlloProf](https://www.alloprof.qc.ca/), an organisation in Quebec, Canada, that offers students resources and a help forum curated by a large number of teachers, covering all subjects taught in primary and secondary school.
Raw data on questions is available in the following files:
- `data/questions/categories.json`: subjects and their corresponding id
- `data/questions/comments.json`: explanation (answer) data
- `data/questions/discussions.json`: question data
- `data/questions/grades.json`: grades and their corresponding id
- `data/questions/roles.json`: information about the user type for each user id
Raw data on reference pages is available in the following files:
- `data/pages/page-content-en.json`: data for the reference pages in English
- `data/pages/page-content-fr.json`: data for the reference pages in French
The data can be parsed and structured using the script `scripts/parse_data.py` to create the file `data/alloprof.csv` with the following columns:
- `id` (str) : Id of the document
- `url` (str) : URL of the document
- `text` (str) : Parsed text of the document
- `language` (str) : Either "fr" or "en", the language of the document
- `user` (int) : Id corresponding to the user who asked the question
- `images` (str) : ";" separated list of URLs of images contained in the document
- `relevant` (str) : ";" separated list of document ids appearing as links in the explanation to that document. For files, this will always be empty as there is no corresponding explanation
- `is_query` (bool) : If this document is a question
- `subject` (str) : ";" separated list of school subjects the document is related to
- `grade` (str) : ";" separated list of school grade levels the document is related to
- `possible` (str) : ";" separated list of possible documents ids this document may refer to. This list corresponds to every document of the same subject and grade. For files, this will always be empty to speed up reading and writing
The `possible` column depends on arguments passed to the scripts to add related subjects, and lower and higher grade levels to the possible documents (see paper).
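As a minimal sketch of how the ";"-separated columns can be parsed, the snippet below reads a tiny inline sample standing in for `data/alloprof.csv` (the values and column subset are illustrative only; the real file also contains `url`, `user`, `images`, and `possible`):

```python
import csv
import io

# Tiny inline sample standing in for data/alloprof.csv (illustrative values only).
sample = io.StringIO(
    "id,text,language,relevant,is_query,subject,grade\n"
    "q1,Comment additionner des fractions ?,fr,d7;d9,True,math,3;4\n"
    "d7,Addition de fractions,fr,,False,math,3;4\n"
)

def split_list(value):
    """Turn a ';'-separated field into a list (empty field -> empty list)."""
    return value.split(";") if value else []

rows = []
for row in csv.DictReader(sample):
    # Decode the list-valued columns and the boolean flag.
    row["relevant"] = split_list(row["relevant"])
    row["subject"] = split_list(row["subject"])
    row["grade"] = split_list(row["grade"])
    row["is_query"] = row["is_query"] == "True"
    rows.append(row)

queries = [r for r in rows if r["is_query"]]
print(queries[0]["relevant"])  # ['d7', 'd9']
```

The same decoding applies to any of the ";"-separated columns listed above; an empty field simply becomes an empty list.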
Also note that the provided `alloprof.csv` file is stored with git lfs and can be pulled with `git lfs install && git lfs pull`.
For images, a script to download them is available as `scripts/download_images.py`.
If you have any questions, don't hesitate to mail us at antoine.lefebvre-brossard@mila.quebec.
**Please cite our work as:**
```
@misc{lef23,
doi = {10.48550/ARXIV.2302.07738},
url = {https://arxiv.org/abs/2302.07738},
author = {Lefebvre-Brossard, Antoine and Gazaille, Stephane and Desmarais, Michel C.},
keywords = {Computation and Language (cs.CL), Information Retrieval (cs.IR), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Alloprof: a new French question-answer education dataset and its use in an information retrieval case study},
publisher = {arXiv},
year = {2023},
copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
}
``` |
tanoManzo/mimic_attitude_dataset | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 348621
num_examples: 223
download_size: 213133
dataset_size: 348621
---
# Dataset Card for "mimic_attitude_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ColinCcz/fake-news-80k | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 186269308
num_examples: 73139
- name: test
num_bytes: 22827228
num_examples: 9142
- name: valididation
num_bytes: 23737967
num_examples: 9143
download_size: 143742642
dataset_size: 232834503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valididation
path: data/valididation-*
---
|
gagan3012/dolphin-retrival-EXAMS-QA-qrels | ---
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int32
splits:
- name: test
num_bytes: 49658
num_examples: 2672
download_size: 17141
dataset_size: 49658
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
arieg/bw_spec_cls_4_00_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '140'
'2': '2'
'3': '5'
splits:
- name: train
num_bytes: 44003467.0
num_examples: 800
- name: test
num_bytes: 4361761.0
num_examples: 80
download_size: 42377721
dataset_size: 48365228.0
---
# Dataset Card for "bw_spec_cls_4_00_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aehus/bumblebee_7 | ---
dataset_info:
features:
- name: new_output
dtype: string
- name: new_input
dtype: string
- name: new_instruction
dtype: string
splits:
- name: train
num_bytes: 5403116
num_examples: 5456
download_size: 2756573
dataset_size: 5403116
---
# Dataset Card for "bumblebee_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_kajdun__iubaris-13b-v3 | ---
pretty_name: Evaluation run of kajdun/iubaris-13b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kajdun/iubaris-13b-v3](https://huggingface.co/kajdun/iubaris-13b-v3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kajdun__iubaris-13b-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T10:44:57.625308](https://huggingface.co/datasets/open-llm-leaderboard/details_kajdun__iubaris-13b-v3/blob/main/results_2023-08-26T10%3A44%3A57.625308.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5457651905991454,\n\
\ \"acc_stderr\": 0.03462957237621476,\n \"acc_norm\": 0.5496206901026076,\n\
\ \"acc_norm_stderr\": 0.03461040607243729,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4860621835708466,\n\
\ \"mc2_stderr\": 0.015429990225329837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670456,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.625273849830711,\n\
\ \"acc_stderr\": 0.004830628620181031,\n \"acc_norm\": 0.8177653853813981,\n\
\ \"acc_norm_stderr\": 0.003852488177553968\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330877,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330877\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155196,\n \"\
acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155196\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n\
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7318007662835249,\n\
\ \"acc_stderr\": 0.015842430835269424,\n \"acc_norm\": 0.7318007662835249,\n\
\ \"acc_norm_stderr\": 0.015842430835269424\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194625,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136215,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136215\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.0301164262965406,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.0301164262965406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4860621835708466,\n\
\ \"mc2_stderr\": 0.015429990225329837\n }\n}\n```"
repo_url: https://huggingface.co/kajdun/iubaris-13b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|arc:challenge|25_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hellaswag|10_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T10:44:57.625308.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T10:44:57.625308.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T10:44:57.625308.parquet'
- config_name: results
data_files:
- split: 2023_08_26T10_44_57.625308
path:
- results_2023-08-26T10:44:57.625308.parquet
- split: latest
path:
- results_2023-08-26T10:44:57.625308.parquet
---
# Dataset Card for Evaluation run of kajdun/iubaris-13b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kajdun/iubaris-13b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kajdun/iubaris-13b-v3](https://huggingface.co/kajdun/iubaris-13b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kajdun__iubaris-13b-v3",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T10:44:57.625308](https://huggingface.co/datasets/open-llm-leaderboard/details_kajdun__iubaris-13b-v3/blob/main/results_2023-08-26T10%3A44%3A57.625308.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5457651905991454,
"acc_stderr": 0.03462957237621476,
"acc_norm": 0.5496206901026076,
"acc_norm_stderr": 0.03461040607243729,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4860621835708466,
"mc2_stderr": 0.015429990225329837
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670456,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427
},
"harness|hellaswag|10": {
"acc": 0.625273849830711,
"acc_stderr": 0.004830628620181031,
"acc_norm": 0.8177653853813981,
"acc_norm_stderr": 0.003852488177553968
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.019149093743155196,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.019149093743155196
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7318007662835249,
"acc_stderr": 0.015842430835269424,
"acc_norm": 0.7318007662835249,
"acc_norm_stderr": 0.015842430835269424
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194625,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136215,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.0301164262965406,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.0301164262965406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4860621835708466,
"mc2_stderr": 0.015429990225329837
}
}
```
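Once loaded as a plain dict, these per-task metrics can be aggregated locally. A minimal sketch (the `results` mapping below is a hand-copied subset of the JSON above, not the full file, and the averaging shown is illustrative rather than the leaderboard's exact scoring):

```python
# Average the MMLU (hendrycksTest) acc_norm scores from a results dict
# shaped like the JSON above. Only a hand-copied subset is shown here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5111111111111111},
    "harness|arc:challenge|25": {"acc_norm": 0.591296928327645},
}

mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")  # MMLU subtasks only
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU acc_norm over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same filter-then-average pattern works for any task family by changing the key prefix.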
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/bleach | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Bleach
This is the image base of the bangumi Bleach. We detected 181 characters and 30903 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 4514 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 48 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 84 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 647 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 179 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 2597 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 2092 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 178 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 2071 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 125 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 143 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 46 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 274 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 114 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 64 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 337 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 170 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 180 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 26 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 204 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 125 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 37 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 276 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 353 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 1148 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 49 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 148 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 326 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 32 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 37 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 1169 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 55 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 46 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 95 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 251 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 96 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 69 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 239 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 175 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 57 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 23 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 29 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 32 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 221 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 43 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 759 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 163 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 104 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 53 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 68 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 22 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 55 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 116 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 44 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 39 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 98 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 183 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 37 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 61 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 23 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 41 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 76 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 54 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 245 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 147 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 29 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 12 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 115 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 30 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 327 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 301 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 18 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 60 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 86 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 95 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 178 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 34 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 134 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 28 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 75 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 53 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 65 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 51 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 385 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 15 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 182 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 98 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 85 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 99 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 44 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 65 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 58 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 747 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 1188 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 132 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 30 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 417 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 117 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 101 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 20 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 16 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 21 | [Download](101/dataset.zip) |  |  |  |  |  |  |  |  |
| 102 | 30 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 50 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 52 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 70 | [Download](105/dataset.zip) |  |  |  |  |  |  |  |  |
| 106 | 21 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 80 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 22 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 18 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 89 | [Download](110/dataset.zip) |  |  |  |  |  |  |  |  |
| 111 | 121 | [Download](111/dataset.zip) |  |  |  |  |  |  |  |  |
| 112 | 139 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 38 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 31 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 68 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 24 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 34 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 221 | [Download](118/dataset.zip) |  |  |  |  |  |  |  |  |
| 119 | 11 | [Download](119/dataset.zip) |  |  |  |  |  |  |  |  |
| 120 | 56 | [Download](120/dataset.zip) |  |  |  |  |  |  |  |  |
| 121 | 26 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 92 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 31 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 24 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 26 | [Download](125/dataset.zip) |  |  |  |  |  |  |  |  |
| 126 | 46 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 252 | [Download](127/dataset.zip) |  |  |  |  |  |  |  |  |
| 128 | 31 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| 129 | 30 | [Download](129/dataset.zip) |  |  |  |  |  |  |  |  |
| 130 | 122 | [Download](130/dataset.zip) |  |  |  |  |  |  |  |  |
| 131 | 14 | [Download](131/dataset.zip) |  |  |  |  |  |  |  |  |
| 132 | 50 | [Download](132/dataset.zip) |  |  |  |  |  |  |  |  |
| 133 | 16 | [Download](133/dataset.zip) |  |  |  |  |  |  |  |  |
| 134 | 14 | [Download](134/dataset.zip) |  |  |  |  |  |  |  |  |
| 135 | 152 | [Download](135/dataset.zip) |  |  |  |  |  |  |  |  |
| 136 | 53 | [Download](136/dataset.zip) |  |  |  |  |  |  |  |  |
| 137 | 43 | [Download](137/dataset.zip) |  |  |  |  |  |  |  |  |
| 138 | 23 | [Download](138/dataset.zip) |  |  |  |  |  |  |  |  |
| 139 | 70 | [Download](139/dataset.zip) |  |  |  |  |  |  |  |  |
| 140 | 20 | [Download](140/dataset.zip) |  |  |  |  |  |  |  |  |
| 141 | 20 | [Download](141/dataset.zip) |  |  |  |  |  |  |  |  |
| 142 | 59 | [Download](142/dataset.zip) |  |  |  |  |  |  |  |  |
| 143 | 17 | [Download](143/dataset.zip) |  |  |  |  |  |  |  |  |
| 144 | 14 | [Download](144/dataset.zip) |  |  |  |  |  |  |  |  |
| 145 | 69 | [Download](145/dataset.zip) |  |  |  |  |  |  |  |  |
| 146 | 32 | [Download](146/dataset.zip) |  |  |  |  |  |  |  |  |
| 147 | 15 | [Download](147/dataset.zip) |  |  |  |  |  |  |  |  |
| 148 | 31 | [Download](148/dataset.zip) |  |  |  |  |  |  |  |  |
| 149 | 13 | [Download](149/dataset.zip) |  |  |  |  |  |  |  |  |
| 150 | 42 | [Download](150/dataset.zip) |  |  |  |  |  |  |  |  |
| 151 | 13 | [Download](151/dataset.zip) |  |  |  |  |  |  |  |  |
| 152 | 133 | [Download](152/dataset.zip) |  |  |  |  |  |  |  |  |
| 153 | 9 | [Download](153/dataset.zip) |  |  |  |  |  |  |  |  |
| 154 | 27 | [Download](154/dataset.zip) |  |  |  |  |  |  |  |  |
| 155 | 53 | [Download](155/dataset.zip) |  |  |  |  |  |  |  |  |
| 156 | 15 | [Download](156/dataset.zip) |  |  |  |  |  |  |  |  |
| 157 | 16 | [Download](157/dataset.zip) |  |  |  |  |  |  |  |  |
| 158 | 30 | [Download](158/dataset.zip) |  |  |  |  |  |  |  |  |
| 159 | 19 | [Download](159/dataset.zip) |  |  |  |  |  |  |  |  |
| 160 | 191 | [Download](160/dataset.zip) |  |  |  |  |  |  |  |  |
| 161 | 12 | [Download](161/dataset.zip) |  |  |  |  |  |  |  |  |
| 162 | 56 | [Download](162/dataset.zip) |  |  |  |  |  |  |  |  |
| 163 | 12 | [Download](163/dataset.zip) |  |  |  |  |  |  |  |  |
| 164 | 24 | [Download](164/dataset.zip) |  |  |  |  |  |  |  |  |
| 165 | 29 | [Download](165/dataset.zip) |  |  |  |  |  |  |  |  |
| 166 | 11 | [Download](166/dataset.zip) |  |  |  |  |  |  |  |  |
| 167 | 10 | [Download](167/dataset.zip) |  |  |  |  |  |  |  |  |
| 168 | 17 | [Download](168/dataset.zip) |  |  |  |  |  |  |  |  |
| 169 | 8 | [Download](169/dataset.zip) |  |  |  |  |  |  |  |  |
| 170 | 28 | [Download](170/dataset.zip) |  |  |  |  |  |  |  |  |
| 171 | 26 | [Download](171/dataset.zip) |  |  |  |  |  |  |  |  |
| 172 | 12 | [Download](172/dataset.zip) |  |  |  |  |  |  |  |  |
| 173 | 18 | [Download](173/dataset.zip) |  |  |  |  |  |  |  |  |
| 174 | 13 | [Download](174/dataset.zip) |  |  |  |  |  |  |  |  |
| 175 | 10 | [Download](175/dataset.zip) |  |  |  |  |  |  |  |  |
| 176 | 13 | [Download](176/dataset.zip) |  |  |  |  |  |  |  |  |
| 177 | 7 | [Download](177/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 178 | 10 | [Download](178/dataset.zip) |  |  |  |  |  |  |  |  |
| 179 | 21 | [Download](179/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 315 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
RUCAIBox/Simplification | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text2text-generation
task_ids:
- text-simplification
---
These are the simplification datasets collected by TextBox, including:
- WikiAuto + Turk/ASSET (wia-t).
The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset). |
InHawK/llama2-summarize-dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 22568341
num_examples: 500
download_size: 10759299
dataset_size: 22568341
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_mmlu_en_f4 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 130579.0
num_examples: 250
download_size: 0
dataset_size: 130579.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_en_f4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thunder-rk/Stories-t5 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 158444.25
num_examples: 66
- name: test
num_bytes: 52814.75
num_examples: 22
download_size: 150839
dataset_size: 211259.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
camel-ai/ai_society | ---
license: cc-by-nc-4.0
language:
- en
tags:
- instruction-finetuning
pretty_name: CAMEL AI Society
task_categories:
- text-generation
arxiv: 2303.17760
extra_gated_prompt: "By using this data, you acknowledge and agree to utilize it solely for research purposes, recognizing that the dataset may contain inaccuracies due to its artificial generation through ChatGPT."
extra_gated_fields:
Name: text
Email: text
I will adhere to the terms and conditions of this dataset: checkbox
---
# **CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society**
- **Github:** https://github.com/lightaime/camel
- **Website:** https://www.camel-ai.org/
- **Arxiv Paper:** https://arxiv.org/abs/2303.17760
## Dataset Summary
The AI Society dataset is composed of 25K conversations between two gpt-3.5-turbo agents. It was obtained by running role-playing for a combination of 50 user roles and 50 assistant roles, with each combination running over 10 tasks.
We provide two formats: the "chat" format, the `ai_society_chat.tar.gz` file containing the conversations in a conversational instruction-following layout, and the "instruction" format, the `ai_society_instructions.json` file.
## Data Fields
**The data fields for instructions format (`ai_society_instructions.json`) are as follows:**
* `id`: {assistant\_role\_index}\_{user\_role\_index}\_{task\_index}, for example 001_002_003 refers to assistant role 1, user role 2, and task 3 from our text assistant role names, user role names and task text files.
* `role_1`: assistant role
* `role_2`: user role
* `original_task`: the general assigned task for the assistant and user to cooperate on.
* `specified_task`: the task after task specifier, this task is more specific than the original task.
* `role_1_response`: user response text before the instruction.
* `role_1_message_id`: message ID in the full raw conversation.
* `instruction`: describes the task the assistant is supposed to perform.
* `input`: provides further context or information for the requested instruction.
* `output`: the answer to the instruction as generated by gpt-3.5-turbo.
* `termination_reason`: the reason the chat was terminated.
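As a quick illustration of the instruction format, here is how one might assemble a training prompt from a single record. This is a minimal sketch: the record below is invented for illustration, with only the field names taken from the list above.

```python
# Hypothetical record mirroring the documented fields of
# ai_society_instructions.json (all values invented for this sketch).
record = {
    "id": "001_002_003",
    "role_1": "Programmer",
    "role_2": "Entrepreneur",
    "original_task": "Develop an app",
    "specified_task": "Develop a budgeting app aimed at freelancers",
    "instruction": "Write a function that sums monthly expenses.",
    "input": "Expenses are given as a list of floats.",
    "output": "def total(expenses): return sum(expenses)",
    "termination_reason": "<CAMEL_TASK_DONE>",
}

def to_prompt(rec):
    """Join the instruction and its optional input into one prompt string."""
    prompt = rec["instruction"]
    if rec.get("input"):
        prompt += "\n" + rec["input"]
    return prompt

print(to_prompt(record))
# → Write a function that sums monthly expenses.
#   Expenses are given as a list of floats.
```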
**The data fields for chat format (`ai_society_chat.tar.gz`) are as follows:**
* `input`: {assistant\_role\_index}\_{user\_role\_index}\_{task\_index}, for example 001_002_003 refers to assistant role 1, user role 2, and task 3 from our text assistant role names, user role names and task text files.
* `role_1`: assistant role
* `role_2`: user role
* `original_task`: the general assigned task for the assistant and user to cooperate on.
* `specified_task`: the task after task specifier, this task is more specific than the original task.
* `message_k`: refers to the k<sup>_th_</sup> message of the conversation.
* `role_type`: refers to whether the agent is an assistant or a user.
* `role_name`: refers to the assigned assistant/user role.
* `role`: refers to the role of the agent during the message for openai api. [usually not needed]
* `content`: refers to the content of the message.
* `termination_reason`: the reason the chat was terminated.
* `num_messages`: refers to the total number of messages in the chat.
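The `message_k` keys can be iterated over in order to reconstruct a conversation. The sketch below uses an invented two-message record; only the field names come from the list above.

```python
# Hypothetical chat-format record (field names from this card, values invented).
chat = {
    "role_1": "Programmer",
    "role_2": "Entrepreneur",
    "num_messages": 2,
    "message_1": {"role_type": "user", "role_name": "Entrepreneur",
                  "content": "Instruction: list the core features."},
    "message_2": {"role_type": "assistant", "role_name": "Programmer",
                  "content": "Solution: budgets, reminders, reports."},
}

# Walk message_1 .. message_{num_messages} to rebuild the transcript.
lines = []
for k in range(1, chat["num_messages"] + 1):
    msg = chat[f"message_{k}"]
    lines.append(f'{msg["role_name"]} ({msg["role_type"]}): {msg["content"]}')
transcript = "\n".join(lines)
print(transcript)
```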
**Download in python**
```python
from huggingface_hub import hf_hub_download
hf_hub_download(repo_id="camel-ai/ai_society", repo_type="dataset", filename="ai_society_chat.tar.gz",
local_dir="datasets/", local_dir_use_symlinks=False)
```
### Citation
```
@misc{li2023camel,
title={CAMEL: Communicative Agents for "Mind" Exploration of Large Scale Language Model Society},
author={Guohao Li and Hasan Abed Al Kader Hammoud and Hani Itani and Dmitrii Khizbullin and Bernard Ghanem},
year={2023},
eprint={2303.17760},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Disclaimer:
This data was synthetically generated by gpt-3.5-turbo and might contain incorrect information. The dataset is there only for research purposes.
|
adamjweintraut/bart-finetuned-eli5_precomputed_best_rouge_eval_2023-12-08_run.csv | ---
dataset_info:
features:
- name: q_id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_precision_min
dtype: float64
- name: rougeL_recall_min
dtype: float64
- name: rougeL_fmeasure_min
dtype: float64
- name: rougeL_precision_median
dtype: float64
- name: rougeL_recall_median
dtype: float64
- name: rougeL_fmeasure_median
dtype: float64
- name: rougeL_precision_max
dtype: float64
- name: rougeL_recall_max
dtype: float64
- name: rougeL_fmeasure_max
dtype: float64
- name: nli_label
dtype: string
- name: nli_plot_val
dtype: int64
- name: nli_score
dtype: float32
- name: sent_sim
dtype: float32
splits:
- name: train
num_bytes: 15592501
num_examples: 1250
download_size: 9700492
dataset_size: 15592501
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DanielDimas/Rambo | ---
license: openrail
---
|
ismaelvillanuevamiranda/Colo_Rectal_DatasetQA_finetune_data | ---
license: mit
---
|
yzhuang/autotree_automl_electricity_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 174960000
num_examples: 10000
- name: validation
num_bytes: 174960000
num_examples: 10000
download_size: 101429747
dataset_size: 349920000
---
# Dataset Card for "autotree_automl_electricity_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
micsell/hebrew_kan_sentence30000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1842548455.0
num_examples: 10000
download_size: 1841787410
dataset_size: 1842548455.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_indic-hi_mkb | ---
language: hi
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-hi_mkb
# mkb
- Dataset uid: `mkb`
### Description
The Prime Minister's speeches - Mann Ki Baat - broadcast on All India Radio and translated into many languages.
### Homepage
- https://huggingface.co/datasets/mkb
- http://preon.iiit.ac.in/~jerin/bhasha/
### Licensing
### Speaker Locations
### Sizes
- 0.0009 % of total
- 0.0174 % of indic-ta
- 0.0252 % of indic-ml
- 0.0416 % of indic-mr
- 0.0601 % of indic-gu
- 0.0047 % of indic-bn
- 0.0040 % of indic-hi
- 0.0185 % of indic-te
- 0.0162 % of indic-or
- 0.0026 % of indic-ur
### BigScience processing steps
#### Filters applied to: indic-ta
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-ur
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
ruanchaves/hashset_manual | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
language:
- hi
- en
license:
- unknown
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- structure-prediction
task_ids:
- named-entity-recognition
pretty_name: HashSet Manual
tags:
- word-segmentation
---
# Dataset Card for HashSet Manual
## Dataset Description
- **Repository:** [prashantkodali/HashSet](https://github.com/prashantkodali/HashSet)
- **Paper:** [HashSet -- A Dataset For Hashtag Segmentation](https://arxiv.org/abs/2201.06741)
### Dataset Summary
HashSet is a new dataset consisting of 1.9k manually annotated and 3.3M loosely supervised tweets for testing the
efficiency of hashtag segmentation models. We compare state-of-the-art hashtag segmentation models on HashSet and other
baseline datasets (STAN and BOUN), and analyse the results across the datasets to argue that HashSet can act
as a good benchmark for hashtag segmentation tasks.
HashSet Manual: contains 1.9k manually annotated hashtags. Each row consists of the hashtag, the segmented hashtag, named entity annotations, whether the hashtag contains a mix of Hindi and English tokens, and whether it contains non-English tokens.
### Languages
Mostly Hindi and English.
## Dataset Structure
### Data Instances
```json
{
"index": 10,
"hashtag": "goodnewsmegan",
"segmentation": "good news megan",
"spans": {
"start": [
8
],
"end": [
13
],
"text": [
"megan"
]
},
"source": "roman",
"gold_position": null,
"mix": false,
"other": false,
"ner": true,
"annotator_id": 1,
"annotation_id": 2088,
"created_at": "2021-12-30 17:10:33.800607",
"updated_at": "2021-12-30 17:10:59.714840",
"lead_time": 3896.182,
"rank": {
"position": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10
],
"candidate": [
"goodnewsmegan",
"goodnewsmeg an",
"goodnews megan",
"goodnewsmega n",
"go odnewsmegan",
"good news megan",
"good newsmegan",
"g oodnewsmegan",
"goodnewsme gan",
"goodnewsm egan"
]
}
}
```
### Data Fields
- `index`: a numerical index annotated by Kodali et al.
- `hashtag`: the original hashtag.
- `segmentation`: the gold segmentation for the hashtag.
- `spans`: named entity spans.
- `source`: data source.
- `gold_position`: position of the gold segmentation on the `segmentation` field inside the `rank`.
- `mix`: The hashtag has a mix of English and Hindi tokens.
- `other`: The hashtag has non-English tokens.
- `ner`: The hashtag has named entities.
- `annotator_id`: annotator ID.
- `annotation_id`: annotation ID.
- `created_at`: Creation date timestamp.
- `updated_at`: Update date timestamp.
- `lead_time`: lead time field annotated by Kodali et al.
- `rank`: rank of each candidate selected by a baseline word segmenter (WordBreaker).
- `candidates`: candidates selected by a baseline word segmenter (WordBreaker).
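Judging from the data instance above, the `spans` offsets appear to index into the unsegmented `hashtag` string with an exclusive `end` (Python slice convention): `"goodnewsmegan"[8:13]` yields `"megan"`. The sketch below checks that interpretation; the record is copied from the instance shown in this card, but the indexing convention is our reading, not something the authors document.

```python
# Record copied from the data instance in this card (other fields omitted).
record = {
    "hashtag": "goodnewsmegan",
    "segmentation": "good news megan",
    "spans": {"start": [8], "end": [13], "text": ["megan"]},
}

# Assumption: span offsets index into the unsegmented hashtag,
# with `end` exclusive, so hashtag[start:end] should equal the span text.
for start, end, text in zip(record["spans"]["start"],
                            record["spans"]["end"],
                            record["spans"]["text"]):
    extracted = record["hashtag"][start:end]
    print(extracted, extracted == text)
# → megan True
```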
## Dataset Creation
- All hashtag segmentation and identifier splitting datasets on this profile have the same basic fields: `hashtag` and `segmentation` or `identifier` and `segmentation`.
- The only difference between `hashtag` and `segmentation` or between `identifier` and `segmentation` are the whitespace characters. Spell checking, expanding abbreviations or correcting characters to uppercase go into other fields.
- There is always whitespace between an alphanumeric character and a sequence of any special characters ( such as `_` , `:`, `~` ).
- If there are any annotations for named entity recognition and other token classification tasks, they are given in a `spans` field.
## Additional Information
### Citation Information
```
@article{kodali2022hashset,
title={HashSet--A Dataset For Hashtag Segmentation},
author={Kodali, Prashant and Bhatnagar, Akshala and Ahuja, Naman and Shrivastava, Manish and Kumaraguru, Ponnurangam},
journal={arXiv preprint arXiv:2201.06741},
year={2022}
}
```
### Contributions
This dataset was added by [@ruanchaves](https://github.com/ruanchaves) while developing the [hashformers](https://github.com/ruanchaves/hashformers) library. |
AI4DS/bird_nl_to_sql_node | ---
license: apache-2.0
---
|
Vertex-Test/WeaponTest | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1 | ---
pretty_name: Evaluation run of NeuralNovel/Ember-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Ember-7B-v0.1](https://huggingface.co/NeuralNovel/Ember-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T04:54:30.326660](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1/blob/main/results_2024-01-21T04-54-30.326660.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350177574762061,\n\
\ \"acc_stderr\": 0.03209396150318776,\n \"acc_norm\": 0.6453628417891409,\n\
\ \"acc_norm_stderr\": 0.032876548477357756,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6328773143242248,\n\
\ \"mc2_stderr\": 0.015426240628860234\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.01387242322371817,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n\
\ \"acc_stderr\": 0.004685334844038663,\n \"acc_norm\": 0.8552081258713403,\n\
\ \"acc_norm_stderr\": 0.003511717085451996\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610076,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6328773143242248,\n\
\ \"mc2_stderr\": 0.015426240628860234\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \
\ \"acc_stderr\": 0.0058298983559371955\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Ember-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T04-54-30.326660.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- '**/details_harness|winogrande|5_2024-01-21T04-54-30.326660.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T04-54-30.326660.parquet'
- config_name: results
data_files:
- split: 2024_01_21T04_54_30.326660
path:
- results_2024-01-21T04-54-30.326660.parquet
- split: latest
path:
- results_2024-01-21T04-54-30.326660.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Ember-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Ember-7B-v0.1](https://huggingface.co/NeuralNovel/Ember-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T04:54:30.326660](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1/blob/main/results_2024-01-21T04-54-30.326660.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6350177574762061,
"acc_stderr": 0.03209396150318776,
"acc_norm": 0.6453628417891409,
"acc_norm_stderr": 0.032876548477357756,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6328773143242248,
"mc2_stderr": 0.015426240628860234
},
"harness|arc:challenge|25": {
"acc": 0.6569965870307167,
"acc_stderr": 0.01387242322371817,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815293
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.004685334844038663,
"acc_norm": 0.8552081258713403,
"acc_norm_stderr": 0.003511717085451996
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947409,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610076,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949834,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6328773143242248,
"mc2_stderr": 0.015426240628860234
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918747
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.0058298983559371955
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
isashap/resumetestwithlabels | ---
language:
- en
pretty_name: ai resume points
size_categories:
- n<1K
--- |
eminecg/petitions_29-ds | ---
dataset_info:
features:
- name: petition
dtype: string
- name: petition_length
dtype: int64
splits:
- name: train
num_bytes: 30457698.3
num_examples: 2475
- name: validation
num_bytes: 3384188.7
num_examples: 275
download_size: 15645193
dataset_size: 33841887.0
---
# Dataset Card for "petitions_29-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rijgersberg/no_robots_nl | ---
configs:
- config_name: default
data_files:
- split: test_sft
path: data/test_sft-*
- split: train_sft
path: data/train_sft-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: messages_nl
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: test_sft
num_bytes: 1517416
num_examples: 500
- name: train_sft
num_bytes: 28407005
num_examples: 9500
download_size: 18675565
dataset_size: 29924421
license: cc-by-nc-4.0
language:
- nl
- en
tags:
- GEITje
task_categories:
- conversational
- text-generation
size_categories:
- 10K<n<100K
pretty_name: No Robots NL
---
# Dataset Card for "no_robots_nl"
A translated version of all 10k examples from [HuggingFaceH4/no_robots](https://huggingface.co/datasets/HuggingFaceH4/no_robots).
Automatically translated by GPT-3.5.
## More info
Read more about GEITje-chat, the datasets and the translation code in the [📄 README](https://github.com/Rijgersberg/GEITje/blob/main/README-en.md) on GitHub. |
nampdn-ai/tiny-strange-textbooks | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: Tiny Strange Textbooks
size_categories:
- 1M<n<10M
tags:
- synthetic
---
# Quirky Textbook Trove: Compact Excellence for Small Language Models
The Strange dataset is 100% AI-generated, a compilation aligned with the vision of the [Textbooks Are All You Need](https://arxiv.org/abs/2306.11644) and [Textbooks Are All You Need II: phi-1.5 technical report](https://arxiv.org/abs/2309.05463) research. This dataset features 2.7M synthetic textbooks, encapsulating 16GB of raw, deduplicated text data. The unusual name reflects its unconventional synthesis methodology, its compact size, and its emphasis on clear, focused content.
The dataset comprises text documents, each representing a tiny synthetic textbook. The source of this data is advanced open LLM-generated text, ensuring a high-quality, structured representation across a diverse range of subjects.
## Motivation
The creation of the dataset is driven by the need for high-quality, efficient training data. By emulating the principles outlined in the paper, this dataset aims to contribute to the development of more efficient language models that can achieve remarkable performance with less data.
## Usage
Researchers and AI practitioners can leverage this dataset for experiments in language model training, particularly those focused on the efficiency and efficacy of models trained on structured, high-quality data.
### Text Length Distribution
The textbooks in this dataset exhibit the following characteristics in terms of text length (measured in characters):
- **Mean**: 6,456.23
- **Standard Deviation**: 2,559.61
- **25th Percentile**: 4,831
- **Median (50th Percentile)**: 6,265
- **75th Percentile**: 8,048
These statistics indicate a varied range of text lengths, providing a comprehensive dataset suitable for diverse applications in language model training.
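These percentile figures can be reproduced with a short script once the text is available locally. A minimal sketch using only the Python standard library, shown here on a toy corpus (the toy numbers are illustrative; the real figures require the full dataset):

```python
import statistics

def length_stats(texts):
    """Character-length statistics for a list of documents."""
    lengths = sorted(len(t) for t in texts)
    # quantiles(n=4) returns the 25th, 50th, and 75th percentiles.
    p25, median, p75 = statistics.quantiles(lengths, n=4)
    return {
        "mean": statistics.mean(lengths),
        "stdev": statistics.stdev(lengths),
        "p25": p25,
        "median": median,
        "p75": p75,
    }

# Toy corpus standing in for the real textbooks.
corpus = ["a" * n for n in (4000, 5000, 6000, 7000, 8000)]
print(length_stats(corpus))
```

Note that `statistics.quantiles` uses exclusive interpolation by default, so results on small samples may differ slightly from other percentile conventions.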
## Contribution
Contributions to the dataset are encouraged and valued. Enhancements can range from adding new textbooks to optimizing existing content for better quality and diversity.
## Acknowledgments
The development of this dataset was inspired by the groundbreaking work presented in the paper. I acknowledge the contribution of all the community members and the original authors (Microsoft Research) who have influenced this project.
### Disclaimer
While every effort has been made to ensure the accuracy of the information contained within this dataset, please note that it is provided 'as is' and without any warranties.
The use of the data is intended for research purposes only. You are advised to verify any information obtained from this dataset before acting upon it.
## Tiny Series
Explore the possibilities and limitations of building Small Language Models with these tiny gems of data!
- [TinyStories](https://arxiv.org/abs/2305.07759): The paper that sparked my interest in the journey of the tiny-* series.
- [tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes): Collection of 1.6M short and clear code snippets that can help LLM models learn how to reason.
- [tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks): 420k "things of internet" synthetic textbooks.
- [tiny-code-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-code-textbooks): Collection of 207k code explanation synthetic textbooks.
- [tiny-math-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-math-textbooks): Collection of 635k short math textbook on various mathematical topics.
- [tiny-orca-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-orca-textbooks): Synthetic textbooks to help a model learn in-context how it should perform tasks the right way.
- [tiny-webtext](https://huggingface.co/datasets/nampdn-ai/tiny-webtext): A 6GB (4.5M records) variety of diverse webtext enriched with critical thinking methods to make an unbiased English dataset.
- [tiny-lessons](https://huggingface.co/datasets/nampdn-ai/tiny-lessons): Subset of tiny-textbooks dataset, various lessons about "things of internet" augmented in a bite-sized textbook Markdown format.
- [tiny-bridgedict](https://huggingface.co/datasets/nampdn-ai/tiny-bridgedict): A dataset that links and transfers knowledge between English, Vietnamese, and Chinese in tiny multilingual models.
## Citation
```
@misc {nam_pham_2024,
author = { {Nam Pham} },
title = { tiny-strange-textbooks (Revision 6f304f1) },
year = 2024,
url = { https://huggingface.co/datasets/nampdn-ai/tiny-strange-textbooks },
doi = { 10.57967/hf/1612 },
publisher = { Hugging Face }
}
``` |
gangkongkong/koalpaca-llama2 | ---
license: apache-2.0
---
|
version-control/data-0 | ---
dataset_info:
features:
- name: version
dtype: string
- name: code
dtype: string
- name: apis
sequence: string
- name: full_version
dtype: string
- name: repo_name
dtype: string
- name: hexsha
dtype: string
splits:
- name: torch
num_bytes: 10230380
num_examples: 845
- name: tensorflow
num_bytes: 4253206
num_examples: 352
- name: scipy
num_bytes: 5731861
num_examples: 247
- name: pandas
num_bytes: 9106228
num_examples: 572
- name: sklearn
num_bytes: 2401451
num_examples: 202
- name: numpy
num_bytes: 42324032
num_examples: 2493
- name: matplotlib
num_bytes: 3539796
num_examples: 274
download_size: 27256602
dataset_size: 77586954
configs:
- config_name: default
data_files:
- split: torch
path: data/torch-*
- split: tensorflow
path: data/tensorflow-*
- split: scipy
path: data/scipy-*
- split: pandas
path: data/pandas-*
- split: sklearn
path: data/sklearn-*
- split: numpy
path: data/numpy-*
- split: matplotlib
path: data/matplotlib-*
---
|
chanwoopark/kowikitext | ---
license: cc-by-sa-3.0
---
|
AdapterOcean/cortex_ | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 806527
num_examples: 400
download_size: 446479
dataset_size: 806527
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/nagae_iku_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagae_iku/永江衣玖/나가에이쿠 (Touhou)
This is the dataset of nagae_iku/永江衣玖/나가에이쿠 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, hat, red_eyes, purple_hair, ribbon, bow, hat_ribbon, blue_hair, hat_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 599.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 390.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1079 | 734.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 553.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1079 | 959.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nagae_iku_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, capelet, frills, shawl, smile, solo |
| 1 | 25 |  |  |  |  |  | 1girl, frills, shawl, solo, capelet, electricity, skirt, smile |
| 2 | 33 |  |  |  |  |  | 1girl, black_headwear, solo, white_shirt, hagoromo, long_sleeves, black_skirt, looking_at_viewer, smile, bangs, frilled_capelet, red_ribbon, closed_mouth, red_bow, white_capelet, blush, ascot, hair_between_eyes |
| 3 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, solo, looking_at_viewer, nipples, shawl, upper_body, capelet, open_mouth, shirt |
| 4 | 6 |  |  |  |  |  | 2girls, peach, shawl, frills, long_hair |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, bangs, large_breasts, paizuri, fellatio, frills, heart, huge_breasts, nipples, nude, pov, upper_body, bar_censor, black_headwear, cum_on_breasts, looking_at_viewer, simple_background, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | capelet | frills | shawl | smile | solo | electricity | skirt | black_headwear | white_shirt | hagoromo | long_sleeves | black_skirt | looking_at_viewer | bangs | frilled_capelet | red_ribbon | closed_mouth | red_bow | white_capelet | blush | ascot | hair_between_eyes | large_breasts | nipples | upper_body | open_mouth | shirt | 2girls | peach | long_hair | 1boy | hetero | solo_focus | penis | paizuri | fellatio | heart | huge_breasts | nude | pov | bar_censor | cum_on_breasts | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:---------|:--------|:--------|:-------|:--------------|:--------|:-----------------|:--------------|:-----------|:---------------|:--------------|:--------------------|:--------|:------------------|:-------------|:---------------|:----------|:----------------|:--------|:--------|:--------------------|:----------------|:----------|:-------------|:-------------|:--------|:---------|:--------|:------------|:-------|:---------|:-------------|:--------|:----------|:-----------|:--------|:---------------|:-------|:------|:-------------|:-----------------|:--------------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 33 |  |  |  |  |  | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | X | | | | X | | | | | X | X | | | | | | X | | | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
eliphatfs/ObjaversePoints-700K | ---
license: odc-by
language:
- en
size_categories:
- 100K<n<1M
---
# Dataset Card for ObjaversePoints-700K
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Contains high-quality point clouds and captions generated from the Objaverse collection.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nmitchko/i2b2-query-data-1.0 | ---
license: mit
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# i2b2 query data 1.0
This is a dataset of [i2b2](https://www.i2b2.org/) query builder [examples](https://community.i2b2.org/wiki/display/webclient/3.+Query+Tool) that are taken from a test environment of i2b2 and then pre-processed with AI descriptions. |
wanadzhar913/crawl-leaazleeya | ---
license: apache-2.0
language:
- ms
---
# TLDR
* Website: [leaazleeya](https://www.leaazleeya.com/)
* Num. of webpages: 543
* Num. of webpages scraped: 543
* Num. of articles successfully extracted: 534
* Remaining webpages to be scraped: 0
* Scraped on: 5th August 2023
* Text data language: Bahasa Melayu (informal)
* Contributed to: https://github.com/huseinzol05/malaysian-dataset
* Pull request: https://github.com/huseinzol05/malaysian-dataset/pull/245 |
Jonglee/disinfectants_general_qa | ---
dataset_info:
features:
- name: Questions
dtype: string
- name: Answers
dtype: string
splits:
- name: train
num_bytes: 44677
num_examples: 100
download_size: 27086
dataset_size: 44677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "disinfectants_general_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kristen_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kristen Wright (Arknights)
This is the dataset of Kristen Wright (Arknights), containing 85 images and their tags.
The core tags of this character are `long_hair, animal_ears, blonde_hair, dog_ears, dog_girl, hairband, blue_eyes, black_hairband, floppy_ears, breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 144.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kristen_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 85 | 118.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kristen_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 197 | 228.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kristen_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kristen_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, long_sleeves, solo, simple_background, looking_at_viewer, brown_jacket, closed_mouth, coat, smile, white_background, pants, sketch, upper_body, black_shirt, open_jacket |
| 1 | 6 |  |  |  |  |  | 1girl, cleavage, dog_tail, looking_at_viewer, simple_background, solo, white_background, off_shoulder, alternate_costume, long_sleeves, medium_breasts, black_collar, cowboy_shot, dress, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | simple_background | looking_at_viewer | brown_jacket | closed_mouth | coat | smile | white_background | pants | sketch | upper_body | black_shirt | open_jacket | cleavage | dog_tail | off_shoulder | alternate_costume | medium_breasts | black_collar | cowboy_shot | dress | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:--------------------|:--------------------|:---------------|:---------------|:-------|:--------|:-------------------|:--------|:---------|:-------------|:--------------|:--------------|:-----------|:-----------|:---------------|:--------------------|:-----------------|:---------------|:--------------|:--------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X |
|
JonaszPotoniec/dowcipy-polish-jokes-dataset | ---
dataset_info:
features:
- name: joke
dtype: string
- name: upvotes
dtype: int64
- name: downvotes
dtype: int64
splits:
- name: train
num_bytes: 3074127
num_examples: 9020
download_size: 2061760
dataset_size: 3074127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- pl
pretty_name: Dowcipy jaja
tags:
- art
size_categories:
- 1K<n<10K
---
# Dataset consisting of polish jokes
## Warning: Jokes were not curated; some may be offensive, stupid, or simply not funny. It's highly recommended to filter jokes before training, e.g., based on downvotes.
This dataset consists of all (9k) jokes dumped from [jeja.pl](https://dowcipy.jeja.pl/) on 2024-02-14. Jokes are submitted by the community. Besides _the funny_ text itself, I included upvotes and downvotes. You can use them for filtering.
Default sorting is based on a combination of downvotes and upvotes.
If used for training LLMs, it's recommended to use a tokenizer that supports line breaks, as these are often important for readability of the jokes.
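The recommended downvote-based filtering can be sketched as follows. This is a minimal example, not part of the dataset itself; the field names (`joke`, `upvotes`, `downvotes`) match the dataset schema above, but the thresholds are illustrative assumptions you should tune.

```python
def filter_jokes(jokes, min_net_score=0, min_total_votes=5):
    """Keep jokes with enough total votes and a non-negative net score.

    `jokes` is an iterable of dicts with the dataset's fields:
    'joke', 'upvotes', 'downvotes'.
    """
    kept = []
    for j in jokes:
        total = j["upvotes"] + j["downvotes"]
        net = j["upvotes"] - j["downvotes"]
        if total >= min_total_votes and net >= min_net_score:
            kept.append(j)
    return kept

# tiny in-memory demo with made-up records
sample = [
    {"joke": "dobry", "upvotes": 10, "downvotes": 2},  # kept
    {"joke": "slaby", "upvotes": 1, "downvotes": 9},   # dropped: net score < 0
    {"joke": "nowy", "upvotes": 2, "downvotes": 1},    # dropped: too few votes
]
good = filter_jokes(sample)
```

The same predicate can be passed to `datasets.Dataset.filter` after loading the dataset from the Hub.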
## Where to find me
- [Github](https://github.com/JonaszPotoniec)
- [Linkedin](https://www.linkedin.com/in/jonasz-potoniec/)
- [E-mail](mailto:jonasz@potoniec.eu)
- [Telegram](https://t.me/JonaszPotoniec) |
KotiyaSanae/myhime | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Myhime
This is the image base of bangumi myhime, we detected 72 characters, 4631 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
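The card does not prescribe a specific cleaning procedure. As one hedged starting point, exact duplicate files can be dropped by hashing file bytes (this catches only verbatim repeats; semantically noisy or off-character images still need manual review or a classifier):

```python
import hashlib
import tempfile
from pathlib import Path

def drop_exact_duplicates(root):
    """Return file paths under `root`, keeping only the first copy of each
    exact duplicate (compared by MD5 of the raw bytes)."""
    seen, kept = set(), []
    for path in sorted(p for p in Path(root).rglob("*") if p.is_file()):
        digest = hashlib.md5(path.read_bytes()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(path)
    return kept

# tiny demo on a temporary directory with fake image bytes
tmp = Path(tempfile.mkdtemp())
(tmp / "a.png").write_bytes(b"image-1")
(tmp / "b.png").write_bytes(b"image-1")  # exact duplicate of a.png
(tmp / "c.png").write_bytes(b"image-2")
unique = drop_exact_duplicates(tmp)
```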
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 26 | [Download](0\dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 14 | [Download](1\dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 29 | [Download](2\dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 393 | [Download](3\dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 39 | [Download](4\dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 29 | [Download](5\dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 28 | [Download](6\dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 45 | [Download](7\dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 409 | [Download](8\dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 95 | [Download](9\dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 169 | [Download](10\dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 49 | [Download](11\dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 36 | [Download](12\dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 47 | [Download](13\dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 90 | [Download](14\dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 30 | [Download](15\dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 40 | [Download](16\dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 236 | [Download](17\dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 199 | [Download](18\dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 30 | [Download](19\dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 17 | [Download](20\dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 15 | [Download](21\dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 17 | [Download](22\dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 130 | [Download](23\dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 22 | [Download](24\dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 24 | [Download](25\dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 19 | [Download](26\dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 103 | [Download](27\dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 8 | [Download](28\dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 133 | [Download](29\dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 138 | [Download](30\dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 17 | [Download](31\dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 21 | [Download](32\dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 22 | [Download](33\dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 214 | [Download](34\dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 61 | [Download](35\dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 35 | [Download](36\dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 11 | [Download](37\dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 17 | [Download](38\dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 69 | [Download](39\dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 54 | [Download](40\dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 37 | [Download](41\dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 9 | [Download](42\dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 106 | [Download](43\dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 12 | [Download](44\dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 195 | [Download](45\dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 31 | [Download](46\dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 88 | [Download](47\dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 19 | [Download](48\dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 23 | [Download](49\dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 11 | [Download](50\dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 10 | [Download](51\dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 47 | [Download](52\dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 9 | [Download](53\dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 8 | [Download](54\dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 7 | [Download](55\dataset.zip) |  |  |  |  |  |  |  | N/A |
| 56 | 17 | [Download](56\dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 7 | [Download](57\dataset.zip) |  |  |  |  |  |  |  | N/A |
| 58 | 89 | [Download](58\dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 39 | [Download](59\dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 32 | [Download](60\dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 8 | [Download](61\dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 226 | [Download](62\dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 9 | [Download](63\dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 45 | [Download](64\dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 39 | [Download](65\dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 11 | [Download](66\dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 7 | [Download](67\dataset.zip) |  |  |  |  |  |  |  | N/A |
| 68 | 13 | [Download](68\dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 16 | [Download](69\dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 6 | [Download](70\dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 275 | [Download](-1\dataset.zip) |  |  |  |  |  |  |  |  |
|
realjackiexiao/tts-frontend-dataset | ---
license: mit
---
See: https://github.com/Jackiexiao/tts-frontend-dataset |
sahanruwantha/alpaca-sinhala | ---
license: mit
task_categories:
- question-answering
- translation
language:
- si
size_categories:
- 10K<n<100K
tags:
- sinhala
- alpaca-dataset
- translation
- nlp
description: |
The Alpaca dataset translated into Sinhala using Google Translator. Manual verification and correction of translations are recommended for optimal performance.
---
# Sinhala Translated Alpaca Dataset
## Overview
This dataset is a Sinhala translation of the original Alpaca dataset, accomplished using Google Translator. It serves as a resource for the Sinhala language and is intended for various natural language processing (NLP) tasks.
## Dataset Information
- **Name**: Sinhala Translated Alpaca Dataset
- **Format**: Text
- **Task Categories**: Translation
- **Languages**: Sinhala
- **Dataset Size**: 57 MB
## Organization and Authors
- **Organization**: Your Organization Name
- **Authors**: Your Name
## Description
The dataset consists of Sinhala translations of the original Alpaca dataset. It was created using Google Translator. It is important to note that machine translations may not be perfect, and manual verification and correction of translations are recommended for optimal performance.
## Usage
To access and use the dataset, you can visit the [dataset page on Hugging Face](https://www.huggingface.co/your-username/sinhala-alpaca-dataset).
## Acknowledgments
- The original Alpaca dataset creators
- Google Translator for the initial translation process
Feel free to use and contribute to this dataset for enhancing Sinhala language models.
Happy exploring!
|
ovior/twitter_dataset_1713218203 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2766772
num_examples: 8587
download_size: 1558119
dataset_size: 2766772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rmcpantoja/Ald_Mexican_Spanish_speech_dataset | ---
license: unlicense
task_categories:
- token-classification
language:
- es
---
This dataset can be used to fine-tune both Speech-To-Text and Text-To-Speech models.
## Dataset information
* Speaker: Aldo
* Dataset size: 535 audio files
* Audio duration: 4-15 seconds per file (1:33:15 in total)
## Dataset structure
This dataset has been structured in the LJSpeech format:
* wavs/
* 1.wav
* 2.wav
* 3.wav
* ---
* 535.wav
* transcript.csv |
phamtungthuy/vuongmacphaply | ---
dataset_info:
features:
- name: content
dtype: string
- name: question
dtype: string
- name: relevant_laws
list:
- name: law_id
dtype: string
- name: text
dtype: string
- name: split
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 96312457
num_examples: 57354
download_size: 38829771
dataset_size: 96312457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pseudohappy/khanh_ly | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: segment_text
sequence: string
- name: segment_align
sequence:
sequence: int64
- name: sid
dtype: string
splits:
- name: train
num_bytes: 21454602.0
num_examples: 23
download_size: 21210279
dataset_size: 21454602.0
---
# Dataset Card for "khanh_ly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/reisa_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of reisa/宇沢レイサ/玲纱 (Blue Archive)
This is the dataset of reisa/宇沢レイサ/玲纱 (Blue Archive), containing 461 images and their tags.
The core tags of this character are `long_hair, multicolored_hair, blue_hair, pink_hair, hair_ornament, light_blue_hair, streaked_hair, twintails, star_hair_ornament, halo, ahoge, two-tone_hair, purple_eyes, low_twintails, star_halo, pink_halo, hair_scrunchie, hair_between_eyes, scrunchie`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 461 | 646.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisa_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 461 | 549.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisa_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1156 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/reisa_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/reisa_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 38 |  |  |  |  |  | 1girl, black_neckerchief, grey_serafuku, solo, star_(symbol), white_sailor_collar, long_sleeves, grey_shirt, grey_skirt, pleated_skirt, looking_at_viewer, open_mouth, open_jacket, blush, simple_background, white_background, :d, black_jacket, backpack, navel, midriff |
| 1 | 15 |  |  |  |  |  | 1girl, grey_skirt, long_sleeves, open_jacket, open_mouth, pleated_skirt, solo, star_(symbol), striped_clothes, striped_socks, white_sailor_collar, backpack, black_neckerchief, grey_shirt, pink_footwear, grey_serafuku, looking_at_viewer, pink_socks, black_jacket, simple_background, :d, blush, full_body, hair_beads, pink_bag, white_background, sneakers, gun, standing |
| 2 | 6 |  |  |  |  |  | black_jacket, black_neckerchief, grey_skirt, long_sleeves, open_mouth, pleated_skirt, solo_focus, star_(symbol), white_sailor_collar, 2girls, grey_serafuku, hair_beads, open_jacket, smile, grey_shirt, blush, closed_eyes, simple_background, white_background |
| 3 | 9 |  |  |  |  |  | star_(symbol), open_mouth, :d, black_dress, blush, enmaided, looking_at_viewer, maid_apron, maid_headdress, white_background, 1girl, simple_background, solo, frills, puffy_sleeves, short_sleeves, full_body, purple_hair, shoes, white_apron |
| 4 | 6 |  |  |  |  |  | 1girl, completely_nude, hair_beads, nipples, open_mouth, pussy, star_(symbol), 1boy, blush, hetero, loli, looking_at_viewer, navel, solo_focus, uncensored, breasts, flat_chest, simple_background, smile, backpack, cleft_of_venus, collarbone, cum, on_back, penis, pink_bag, spread_legs |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, open_mouth, simple_background, solo, star_(symbol), full_body, toes, :d, collarbone, purple_hair, small_breasts, stomach, white_background, bare_legs, bare_shoulders, barefoot, hair_beads, holding, micro_bikini, sandals, side-tie_bikini_bottom, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_neckerchief | grey_serafuku | solo | star_(symbol) | white_sailor_collar | long_sleeves | grey_shirt | grey_skirt | pleated_skirt | looking_at_viewer | open_mouth | open_jacket | blush | simple_background | white_background | :d | black_jacket | backpack | navel | midriff | striped_clothes | striped_socks | pink_footwear | pink_socks | full_body | hair_beads | pink_bag | sneakers | gun | standing | solo_focus | 2girls | smile | closed_eyes | black_dress | enmaided | maid_apron | maid_headdress | frills | puffy_sleeves | short_sleeves | purple_hair | shoes | white_apron | completely_nude | nipples | pussy | 1boy | hetero | loli | uncensored | breasts | flat_chest | cleft_of_venus | collarbone | cum | on_back | penis | spread_legs | toes | small_breasts | stomach | bare_legs | bare_shoulders | barefoot | holding | micro_bikini | sandals | side-tie_bikini_bottom |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------------|:-------|:----------------|:----------------------|:---------------|:-------------|:-------------|:----------------|:--------------------|:-------------|:--------------|:--------|:--------------------|:-------------------|:-----|:---------------|:-----------|:--------|:----------|:------------------|:----------------|:----------------|:-------------|:------------|:-------------|:-----------|:-----------|:------|:-----------|:-------------|:---------|:--------|:--------------|:--------------|:-----------|:-------------|:-----------------|:---------|:----------------|:----------------|:--------------|:--------|:--------------|:------------------|:----------|:--------|:-------|:---------|:-------|:-------------|:----------|:-------------|:-----------------|:-------------|:------|:----------|:--------|:--------------|:-------|:----------------|:----------|:------------|:-----------------|:-----------|:----------|:---------------|:----------|:-------------------------|
| 0 | 38 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | | X | X | | X | X | X | X | X | X | | X | X | X | X | X | | X | | | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | X | X | | | | | | X | X | | X | X | X | X | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | X | | | | | | X | X | | X | X | | | | X | X | | | | | | | X | X | | | | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | | | | | | X | X | | X | X | X | X | | | X | | | | | | X | X | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X |
|
griffin/progressive_summarization_v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 21648220
num_examples: 4524
- name: eval
num_bytes: 7510493
num_examples: 1548
download_size: 5732033
dataset_size: 29158713
---
# Dataset Card for "progressive_summarization_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/scottishdirectories_loaded | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: string
- name: iiif_manifest
dtype: string
- name: loaded_image
dtype: image
splits:
- name: train
num_bytes: 178900224.0
num_examples: 10000
download_size: 0
dataset_size: 178900224.0
---
# Dataset Card for "scottishdirectories_loaded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cahya/instructions-id | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 35749284.66851785
num_examples: 85242
- name: test
num_bytes: 1986211.1657410732
num_examples: 4736
- name: validation
num_bytes: 1986211.1657410732
num_examples: 4736
download_size: 21158281
dataset_size: 39721706.99999999
---
# Dataset Card for "instructions-id"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_train_free_concat_43 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842643416
num_examples: 2500
download_size: 1850031255
dataset_size: 3842643416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_sent_before_sent_train_200_eval_40_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 2329316
num_examples: 1263
- name: validation
num_bytes: 398956
num_examples: 203
download_size: 533740
dataset_size: 2728272
---
# Dataset Card for "find_sent_before_sent_train_200_eval_40_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonasantos5240/leon | ---
license: openrail
---
|
tuttistudio/NinaGrandma | ---
license: other
license_name: public
license_link: LICENSE
---
|