| datasetId | card |
|---|---|
panes/demo | ---
license: bsd
---
|
heliosprime/twitter_dataset_1713157497 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8624
num_examples: 24
download_size: 12453
dataset_size: 8624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713157497"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
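The `configs` block in the card above maps the `train` split to the file glob `data/train-*`; `datasets.load_dataset` resolves such globs against the repo's files. A minimal sketch of that matching step (the shard file names below are illustrative, not taken from the actual repo):

```python
from fnmatch import fnmatch

# The card maps the 'train' split to files matching this glob.
pattern = "data/train-*"

# Hypothetical shard names following the Hub's usual parquet layout.
candidates = [
    "data/train-00000-of-00001.parquet",
    "data/validation-00000-of-00001.parquet",
]

# Keep only the files the 'train' split would pick up.
train_files = [f for f in candidates if fnmatch(f, pattern)]
print(train_files)  # only the train shard matches
```

In practice you would not do this by hand; the `datasets` library reads the YAML `configs` section and performs the same resolution when you call `load_dataset`.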
tyzhu/squad_context_v3_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 407101
num_examples: 184
- name: validation
num_bytes: 103870
num_examples: 68
download_size: 0
dataset_size: 510971
---
# Dataset Card for "squad_context_v3_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
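The `answers` feature above is a sequence of (`text`, `answer_start`) pairs, so in Python each record carries parallel lists of equal length. A sketch of one SQuAD-style record matching that schema (the concrete values are made up for illustration; only the structure follows the card):

```python
# Illustrative record shaped like the card's schema; values are invented.
record = {
    "id": "5733be284776f41900661182",
    "title": "University_of_Notre_Dame",
    "context": "Architecturally, the school has a Catholic character.",
    "question": "What character does the school have?",
    # A sequence feature becomes parallel lists of equal length.
    "answers": {"text": ["a Catholic character"], "answer_start": [32]},
}

# Each answer_start is a character offset into the context string.
start = record["answers"]["answer_start"][0]
text = record["answers"]["text"][0]
assert record["context"][start:start + len(text)] == text
```

The offset invariant checked in the last line is what makes `answer_start` usable for extractive QA: the answer span can always be recovered by slicing the context.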
open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia | ---
pretty_name: Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LTC-AI-Labs/L2-7b-Hermes-Synthia](https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-07T21:05:32.577722](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia/blob/main/results_2023-12-07T21-05-32.577722.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4795053047509253,\n\
\ \"acc_stderr\": 0.03443814682805137,\n \"acc_norm\": 0.4845931809792794,\n\
\ \"acc_norm_stderr\": 0.03522205048499402,\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.46774826109346956,\n\
\ \"mc2_stderr\": 0.01487542286406713\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.01460013207594709,\n\
\ \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952997410874328,\n\
\ \"acc_stderr\": 0.0048983081672118465,\n \"acc_norm\": 0.7911770563632743,\n\
\ \"acc_norm_stderr\": 0.0040563690969549464\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537313,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537313\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596328,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\
\ \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n\
\ \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051451,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051451\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.601010101010101,\n\
\ \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\": 0.601010101010101,\n\
\ \"acc_norm_stderr\": 0.03488901616852731\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6587155963302752,\n \"acc_stderr\": 0.020328612816592442,\n \"\
acc_norm\": 0.6587155963302752,\n \"acc_norm_stderr\": 0.020328612816592442\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482464,\n \"\
acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482464\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n\
\ \"acc_stderr\": 0.01704124314349096,\n \"acc_norm\": 0.6513409961685823,\n\
\ \"acc_norm_stderr\": 0.01704124314349096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.026882643434022885,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.026882643434022885\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n\
\ \"acc_stderr\": 0.012198140605353599,\n \"acc_norm\": 0.3520208604954368,\n\
\ \"acc_norm_stderr\": 0.012198140605353599\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907904,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.46774826109346956,\n\
\ \"mc2_stderr\": 0.01487542286406713\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.01224880696937642\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13949962092494314,\n \
\ \"acc_stderr\": 0.00954342668719131\n }\n}\n```"
repo_url: https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|arc:challenge|25_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|gsm8k|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hellaswag|10_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-07T21-05-32.577722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- '**/details_harness|winogrande|5_2023-12-07T21-05-32.577722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-07T21-05-32.577722.parquet'
- config_name: results
data_files:
- split: 2023_12_07T21_05_32.577722
path:
- results_2023-12-07T21-05-32.577722.parquet
- split: latest
path:
- results_2023-12-07T21-05-32.577722.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Hermes-Synthia](https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-07T21:05:32.577722](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia/blob/main/results_2023-12-07T21-05-32.577722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4795053047509253,
"acc_stderr": 0.03443814682805137,
"acc_norm": 0.4845931809792794,
"acc_norm_stderr": 0.03522205048499402,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.46774826109346956,
"mc2_stderr": 0.01487542286406713
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.01460013207594709,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5952997410874328,
"acc_stderr": 0.0048983081672118465,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.0040563690969549464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537313,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537313
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051451,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051451
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852731,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852731
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6587155963302752,
"acc_stderr": 0.020328612816592442,
"acc_norm": 0.6587155963302752,
"acc_norm_stderr": 0.020328612816592442
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482464,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482464
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.01704124314349096,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.01704124314349096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.026882643434022885,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.026882643434022885
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.028196400574197422,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.028196400574197422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3520208604954368,
"acc_stderr": 0.012198140605353599,
"acc_norm": 0.3520208604954368,
"acc_norm_stderr": 0.012198140605353599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.46774826109346956,
"mc2_stderr": 0.01487542286406713
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.01224880696937642
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.00954342668719131
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
Nexusflow/PlacesAPIBenchmark
---
dataset_info:
features:
- name: Input
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 9247
num_examples: 48
download_size: 6295
dataset_size: 9247
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
tfnn/HeadsNet
---
license: mit
tags:
- thispersondoesnotexist
- stylegan
- stylegan2
- mesh
- model
- 3d
- asset
- generative
pretty_name: HeadsNet
size_categories:
- 1K<n<10K
---
# HeadsNet
The basic concept is to train an FNN/MLP on vertex data of 3D heads so that it can then reproduce random 3D heads.
This dataset uses the [thispersondoesnotexist_to_triposr_6748_3D_Heads](https://huggingface.co/datasets/tfnn/thispersondoesnotexist_to_triposr_6748_3D_Heads) dataset as a foundation.
The heads dataset was collected using the scraper [Dataset_Scraper.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/Dataset_Scraper.7z?download=true) based on [TripoSR](https://github.com/VAST-AI-Research/TripoSR), which converts the 2D images from [ThisPersonDoesNotExist](https://thispersondoesnotexist.com/) into 3D meshes. _(using [this marching cubes improvement](https://github.com/VAST-AI-Research/TripoSR/issues/22#issuecomment-2010318709) by [thatname/zephyr](https://github.com/thatname))_
Vertex normals need to be generated before we can work with this dataset; the easiest way to achieve this is with a simple [Blender](https://www.blender.org/) script:
```
import bpy
import glob
import pathlib
from os import mkdir
from os.path import isdir
importDir = "ply/"
outputDir = "ply_norm/"
if not isdir(outputDir): mkdir(outputDir)
for file in glob.glob(importDir + "*.ply"):
model_name = pathlib.Path(file).stem
if pathlib.Path(outputDir+model_name+'.ply').is_file() == True: continue
bpy.ops.wm.ply_import(filepath=file)
bpy.ops.wm.ply_export(
filepath=outputDir+model_name+'.ply',
filter_glob='*.ply',
check_existing=False,
ascii_format=False,
export_selected_objects=False,
apply_modifiers=True,
export_triangulated_mesh=True,
export_normals=True,
export_uv=False,
export_colors='SRGB',
global_scale=1.0,
forward_axis='Y',
up_axis='Z'
)
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete(use_global=False)
bpy.ops.outliner.orphans_purge()
bpy.ops.outliner.orphans_purge()
bpy.ops.outliner.orphans_purge()
```
_Importing the PLY without normals causes Blender to automatically generate them._
At this point the PLY files need to be converted to training data; for this I wrote a C program [DatasetGen_2_6.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/DatasetGen_2_6.7z?download=true) using [RPLY](https://w3.impa.br/~diego/software/rply/) to load the PLY files and convert them to binary data, which I have provided here: [HeadsNet-2-6.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/HeadsNet-2-6.7z?download=true).
It's always good to [NaN](https://en.wikipedia.org/wiki/NaN) check your training data after generating it, so I have provided a simple Python script for that here: [nan_check.py](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/nan_check.py?download=true).
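The provided nan_check.py is not reproduced here, but an equivalent check with NumPy looks like the following (assuming the raw float32 `.dat` layout described in this card; the function name is illustrative, not the script's actual contents):

```python
import numpy as np

def nan_check(path):
    # Load the raw float32 training data and count non-finite values (NaN/Inf).
    a = np.fromfile(path, dtype=np.float32)
    bad = int(np.count_nonzero(~np.isfinite(a)))
    print(f"{path}: {a.size} floats, {bad} NaN/Inf")
    return bad == 0
```

A return value of `False` means the file contains at least one NaN or Inf and should be regenerated before training.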
This binary training data can be loaded into Python using [Numpy](https://numpy.org/):
```
import numpy as np

with open("train_x.dat", 'rb') as f:
    load_x = np.fromfile(f, dtype=np.float32)
with open("train_y.dat", 'rb') as f:
    load_y = np.fromfile(f, dtype=np.float32)
```
The data can then be reshaped and saved back out as a numpy array which makes for faster loading:
```
inputsize = 2
outputsize = 6
training_samples = 632847695
train_x = np.reshape(load_x, [training_samples, inputsize])
train_y = np.reshape(load_y, [training_samples, outputsize])
np.save("train_x.npy", train_x)
np.save("train_y.npy", train_y)
```
_632,847,695 samples, each sample is 2 components for train_x (random seed & 0-1 unit sphere position index) and 6 components for train_y (vertex position [x,y,z] & vertex color [r,g,b])._
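After this one-time conversion, later sessions can reload the arrays directly; a minimal sketch (paths assumed to match the files saved above):

```python
import numpy as np

def load_training_data(x_path="train_x.npy", y_path="train_y.npy"):
    """Reload the pre-converted arrays; much faster than re-parsing the .dat files.
    train_x: (samples, 2) seed & icosphere index; train_y: (samples, 6) position & color."""
    return np.load(x_path), np.load(y_path)
```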
The basic premise of how this network is trained and thus how the dataset is generated in the C program is:
1. All models are pre-scaled to a normal cubic scale and then scaled again by 0.55 so that they all fit within a unit sphere.
2. All model vertices are reverse traced from the vertex position to the perimeter of the unit sphere using the vertex normal as the direction.
3. The nearest position on a 10,242 vertex icosphere is found and the network is trained to output the model vertex position and vertex color (6 components) at the index of the icosphere vertex.
4. The icosphere vertex index is scaled to a 0-1 range before being input to the network.
5. The network has only two input parameters; the second is a 0-1 model ID which is randomly selected, and all vertices of a specific model are trained into the network using that ID. The ID does not change per vertex, only per 3D model.
6. This ID effectively acts as a random seed: to generate a random head, input a random 0-1 seed and iterate the icosphere index parameter across the 0-1 range. Since the network outputs one vertex position and vertex color per forward pass, a 20,000 vertex head requires iterating from 0 to 1 in 20,000 increments of 0.00005.
* 1st input parameter = random seed
* 2nd input parameter = icosphere index
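The generation procedure above can be sketched as a simple sampling loop (`forward_pass` is a hypothetical stand-in for whichever trained network is used):

```python
import numpy as np

def generate_head(forward_pass, seed, vertex_count=20000):
    """Sweep the icosphere index input across 0-1 at a fixed random seed.
    forward_pass is assumed to map a (2,) input [seed, index] to a (6,)
    output [x, y, z, r, g, b]; one vertex per forward pass."""
    verts = np.empty((vertex_count, 6), dtype=np.float32)
    for i in range(vertex_count):
        index = i / vertex_count  # e.g. increments of 0.00005 for 20,000 vertices
        verts[i] = forward_pass(np.array([seed, index], dtype=np.float32))
    return verts
```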
More about this type of network topology can be read here: https://gist.github.com/mrbid/1eacdd9d9239b2d324a3fa88591ff852
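For reference, the dataset-side reverse trace (steps 2-3) could be sketched like this; the ray-sphere intersection is reconstructed from the description and the C program remains the authoritative implementation:

```python
import numpy as np

def nearest_icosphere_index(vertex, normal, ico_verts):
    """Trace a model vertex along its normal to the unit sphere surface,
    then return the index of the nearest icosphere vertex."""
    d = normal / np.linalg.norm(normal)
    b = float(np.dot(vertex, d))
    # solve |vertex + t*d| = 1 for the forward intersection t
    t = -b + np.sqrt(b * b - float(np.dot(vertex, vertex)) + 1.0)
    p = vertex + t * d
    return int(np.argmin(np.linalg.norm(ico_verts - p, axis=1)))
```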
## Improvements
* Future networks will have 3 additional input parameters one for each x,y,z of a unit vector for the ray direction from the icosphere index.
* The unit vector used to train the network will just be the vertex normal from the 3D model but inverted.
* When performing inference, more forward passes would be needed, as rays at some density within a ~30° cone pointing toward 0,0,0 would have to be cast per icosphere index position.
* This could result in higher quality outputs, at the cost of an order of magnitude more forward-pass iterations.
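The cone of rays mentioned above could be sampled like this (uniform-in-solid-angle sampling is an assumption; any density scheme would do):

```python
import numpy as np

def cone_directions(axis, half_angle_deg, n):
    """Sample n unit vectors uniformly within a cone around `axis`."""
    a = np.array(axis, dtype=np.float64)
    a /= np.linalg.norm(a)
    # build an orthonormal basis (t, bvec, a) around the cone axis
    helper = np.array([1.0, 0.0, 0.0]) if abs(a[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(a, helper); t /= np.linalg.norm(t)
    bvec = np.cross(a, t)
    cos_max = np.cos(np.radians(half_angle_deg))
    u = np.random.uniform(cos_max, 1.0, n)        # cos(theta), uniform in solid angle
    phi = np.random.uniform(0.0, 2.0 * np.pi, n)
    s = np.sqrt(1.0 - u * u)
    return (np.outer(s * np.cos(phi), t)
            + np.outer(s * np.sin(phi), bvec)
            + np.outer(u, a))
```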
## Updates
* A new dataset has been generated, [HeadsNet-2-6_v2.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/HeadsNet-2-6_v2.7z?download=true): the old one uses a 10,242 vertex unit icosphere while the new one uses a 655,362 vertex unit icosphere, which should lead to a higher quality network. Start training with it instantly using [HeadsNet_v2_Trainer_with_Dataset.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/HeadsNet_v2_Trainer_with_Dataset.7z?download=true).
* The system didn't work out as hoped; trained models of various qualities are provided here: [HeadsNet_Trained_Models.7z](https://huggingface.co/datasets/tfnn/HeadsNet/resolve/main/HeadsNet_Trained_Models.7z?download=true). The network still has some potential: with a more refined dataset and a better network topology it could prove more successful. |
thocheat/vlsp | ---
license: other
---
|
jaydenccc/AI_Storyteller_Dataset | ---
dataset_info:
features:
- name: synopsis
dtype: string
- name: short_story
dtype: string
splits:
- name: train
num_bytes: 204642
num_examples: 100
download_size: 129691
dataset_size: 204642
---
# Dataset Card for "AI_Storyteller_Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
halitefe/lima-tr | ---
task_categories:
- translation
- text2text-generation
- question-answering
language:
- tr
size_categories:
- 1K<n<10K
---
# LIMA-TR
This project is dedicated to translating the LIMA (Less Is More for Alignment) dataset from English to Turkish using OpenAI's API (`gpt-3.5-turbo`).
## Source Dataset
The original LIMA (Less Is More for Alignment) dataset can be found via the [paper](https://arxiv.org/pdf/2305.11206.pdf) and the [dataset repository](https://huggingface.co/datasets/GAIR/lima). |
yuan-sf63/chenyu_label_0.5_64 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
splits:
- name: train
num_bytes: 23408905.925216522
num_examples: 37825
- name: validation
num_bytes: 2601127.0747834775
num_examples: 4203
download_size: 0
dataset_size: 26010033.0
---
# Dataset Card for "chenyu_label_0.5_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored | ---
pretty_name: Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Tap-M/Luna-AI-Llama2-Uncensored](https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T11:27:26.030734](https://huggingface.co/datasets/open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored/blob/main/results_2023-10-13T11-27-26.030734.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19651845637583892,\n\
\ \"em_stderr\": 0.004069389648475156,\n \"f1\": 0.26367764261744986,\n\
\ \"f1_stderr\": 0.0041324301979436965,\n \"acc\": 0.413131375387228,\n\
\ \"acc_stderr\": 0.010360509171200127\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19651845637583892,\n \"em_stderr\": 0.004069389648475156,\n\
\ \"f1\": 0.26367764261744986,\n \"f1_stderr\": 0.0041324301979436965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \
\ \"acc_stderr\": 0.008210320350946319\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T23_46_48.107460
path:
- '**/details_harness|drop|3_2023-10-12T23-46-48.107460.parquet'
- split: 2023_10_13T11_27_26.030734
path:
- '**/details_harness|drop|3_2023-10-13T11-27-26.030734.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T11-27-26.030734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T23_46_48.107460
path:
- '**/details_harness|gsm8k|5_2023-10-12T23-46-48.107460.parquet'
- split: 2023_10_13T11_27_26.030734
path:
- '**/details_harness|gsm8k|5_2023-10-13T11-27-26.030734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T11-27-26.030734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:10:16.061050.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:10:16.061050.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T23_46_48.107460
path:
- '**/details_harness|winogrande|5_2023-10-12T23-46-48.107460.parquet'
- split: 2023_10_13T11_27_26.030734
path:
- '**/details_harness|winogrande|5_2023-10-13T11-27-26.030734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T11-27-26.030734.parquet'
- config_name: results
data_files:
- split: 2023_07_24T15_10_16.061050
path:
- results_2023-07-24T15:10:16.061050.parquet
- split: 2023_10_12T23_46_48.107460
path:
- results_2023-10-12T23-46-48.107460.parquet
- split: 2023_10_13T11_27_26.030734
path:
- results_2023-10-13T11-27-26.030734.parquet
- split: latest
path:
- results_2023-10-13T11-27-26.030734.parquet
---
# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Tap-M/Luna-AI-Llama2-Uncensored](https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T11:27:26.030734](https://huggingface.co/datasets/open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored/blob/main/results_2023-10-13T11-27-26.030734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19651845637583892,
"em_stderr": 0.004069389648475156,
"f1": 0.26367764261744986,
"f1_stderr": 0.0041324301979436965,
"acc": 0.413131375387228,
"acc_stderr": 0.010360509171200127
},
"harness|drop|3": {
"em": 0.19651845637583892,
"em_stderr": 0.004069389648475156,
"f1": 0.26367764261744986,
"f1_stderr": 0.0041324301979436965
},
"harness|gsm8k|5": {
"acc": 0.09855951478392722,
"acc_stderr": 0.008210320350946319
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gayanin/pubmed-abstracts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: refs
dtype: string
splits:
- name: train
num_bytes: 9419993
num_examples: 74724
- name: test
num_bytes: 1206965
num_examples: 9341
- name: validation
num_bytes: 1239760
num_examples: 9341
download_size: 6522287
dataset_size: 11866718
---
# Dataset Card for "pubmed-abstracts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZhankuiHe/reddit_movie_small_v1 | ---
task_categories:
- conversational
language:
- en
tags:
- recommendation
---
# Dataset Card for `Reddit-Movie-small-V1`
## Dataset Description
- **Homepage:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys
- **Repository:** https://github.com/AaronHeee/LLMs-as-Zero-Shot-Conversational-RecSys
- **Paper:** To appear
- **Point of Contact:** zhh004@eng.ucsd.edu
### Dataset Summary
This dataset contains recommendation-related conversations in the movie domain, intended only for research use in tasks such as conversational recommendation and long-query retrieval.
This dataset ranges from Jan. 2022 to Dec. 2022. A larger version (from Jan. 2012 to Dec. 2022) can be found [here](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_large_v1).
### Dataset Processing
We dumped [Reddit](https://reddit.com) conversations about movie recommendations from [pushshift.io](https://pushshift.io) and converted them into [raw text](https://huggingface.co/datasets/ZhankuiHe/reddit_movie_raw), drawn from five subreddits:
- [r/movies](https://www.reddit.com/r/movies/)
- [r/moviesuggestions](https://www.reddit.com/r/suggestions/)
- [r/bestofnetflix](https://www.reddit.com/r/bestofnetflix/)
- [r/nextflixbestof](https://www.reddit.com/r/netflixbestof/)
- [r/truefilm](https://www.reddit.com/r/truefilm/)
After that, we process them by:
1. extracting movie recommendation conversations;
2. recognizing movie mentions in raw text;
3. linking movie mentions to existing movie entities in [IMDB](https://imdb.com) database.
Since the raw text is quite noisy and processing is not perfect, we do observe some failure cases in our processed data. Thus we use V1 to highlight that this is the first processed version. Contributions of cleaner processed versions (such as V2) in the future are welcome, many thanks!
### Disclaimer
⚠️ **Please note that conversations processed from Reddit raw data may include content that is not entirely conducive to a positive experience (e.g., toxic speech). Exercise caution and discretion when utilizing this information.**
## Dataset Structure
### Data Fields
- `id2name.json` provides a lookup table (dictionary) from `itemid` (e.g., `tt0053779`) to `itemname` (e.g., `La Dolce Vita (1960)`). Note that, the `itemid` is from [IMDB](https://imdb.com), so that it can be used to align other movie recommendation datasets sharing the same `itemid`, such as [MovieLens](https://movielens.org/).
- `{train, valid, test}.csv` are question-answer pairs that can be used for training, validation and testing (split chronologically by the dialog's creation timestamp, from oldest to most recent). There are 12 columns in these `*.csv` files:
- `conv_id (string)`: Conversational ID. Since our conversations are collected from reddit posts, we generate conversations by extracting paths in a reddit thread with different replies. An example of `conv_id` is:
```
"t3_rt7enj_0/14" # -> t3_rt7enj is the ID of the first post in the thread, 0 means this is the first path extracted from this thread, and 14 means there are 14 paths in total.
```
- `turn_id (string)`: Conversational turn ID. For example:
```
"t3_rt7enj" # -> We can use (conv_id, turn_id) to uniquely define a row in this dataset.
```
- `turn_order (int64)`: The 0-based turn index within a given conversation, which can be used to sort turns within the conversation. For example:
```
0 # -> It is the first turn in this conversation. Typically, for conversations from Reddit, the number of turns is not very large.
```
- `user_id (string)`: The unique user id. For example:
```
"t2_fweij" # -> user id
```
- `is_seeker (bool)`: Whether the speaker at the current turn is the recommendation seeker or not. For example:
```
true # -> It is the seeker (seeker starts a movie requesting conversation on Reddit).
```
- `utc_time (int64)`: The UTC timestamp when this conversation turn happened. For example:
```
1641234238 # -> Try `datetime.fromtimestamp(1641234238)`
```
- `upvotes (int64)`: The number of upvotes from other Reddit users (it is `null` if this post is the first post in this thread, because upvotes only work for replies). For example:
```
6 # -> 6 upvotes from other Reddit users.
```
- `processed (string)`: The role and text at this conversation turn (processed version). For example:
```
"['USER', 'We decided on tt3501632. They love it so far— very funny!']" # -> [ROLE, Processed string] after `eval()`, where we can match `tt3501632` to real item name using `id2name.json`.
```
- `raw (string)`: The role and text at this conversation turn (raw-text version). For example:
```
"['USER', 'We decided on Thor: Ragnarok. They love it so far— very funny!']" # -> [ROLE, Raw string] after `eval()`, where it is convenient to form it as "USER: We decided on Thor: Ragnarok. They love it so far— very funny!".
```
- `context_processed (string)`: The role and text pairs as the historical conversation context (processed version). For example:
```
"[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are tt3544112, tt1441952, tt1672078, tt0482571, tt0445590, tt0477348...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the tt2015381 movies ..."]]"
# -> [[ROLE, Processed string], [ROLE, Processed string], ...] after `eval()`, where we can match `tt******` to real item name using `id2name.json`.
```
- `context_raw (string)`: The role and text pairs as the historical conversation context (raw version). For example:
```
"[['USER', 'It’s summer break ... Some of the films we have watched (and they enjoyed) in the past are Sing Street, Salmon Fishing in the Yemen, The Life of Pi, The Prestige, LOTR Trilogy, No Country for Old Men...'], ['SYSTEM', "I'm not big on super hero movies, but even I loved the guardians of the Galaxy movies ..."]]"
# -> [[ROLE, Processed string], [ROLE, Processed string], ...] after `eval()`, where we can form "USER: ...\n SYSTEM: ...\n USER:..." easily.
```
- `context_turn_ids (string)`: The conversation context turn_ids associated with context [ROLE, Processed string] pairs. For example:
```
"['t3_8voapb', 't1_e1p0f5h'] # -> These are the `turn_id`s for the context ['USER', 'It’s summer break ...'], ['SYSTEM', "I'm not big on super hero movie...']. They can be used to retrieve more related information like `utc_time` after combining with `conv_id`.
```
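Putting the field descriptions above together, a row's `processed` text can be decoded and its `tt*` IDs resolved back to movie names. The sketch below makes two assumptions not in the original: the `id2name` dictionary stands in for the real `id2name.json` (normally loaded with `json.load`), and `ast.literal_eval` is used in place of `eval()` as a safer equivalent:

```python
import ast
import re

# Stand-in for id2name.json (itemid -> itemname); in practice, load it with json.load.
id2name = {"tt3501632": "Thor: Ragnarok (2017)"}

def decode_processed(processed: str, id2name: dict) -> tuple:
    """Parse a `processed` field ("['ROLE', 'text']") and replace tt-IDs with names."""
    role, text = ast.literal_eval(processed)
    # Substitute every IMDB-style id (tt followed by digits) with its movie name.
    text = re.sub(r"tt\d+", lambda m: id2name.get(m.group(0), m.group(0)), text)
    return role, text

role, text = decode_processed(
    "['USER', 'We decided on tt3501632. They love it so far!']", id2name
)
# role == 'USER'
# text == 'We decided on Thor: Ragnarok (2017). They love it so far!'
```

The same decoding applies to `context_processed` after an outer `ast.literal_eval` over the list of `[ROLE, text]` pairs.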
### Data Splits
We hold out the last 20% of the data (in chronological order by conversation creation time) as the test set. The rest can be treated as training samples. We provide a suggested split of Train into Train and Validation, but you are free to try your own splits.
| | Total | Train + Validation | Test |
| - | - | - | - |
| #Conv. | 171,773 | 154,597 | 17,176 |
| #Turns | 419,233 | 377,614 | 41,619 |
| #Users | 12,508 | 11,477 | 1,384 |
| #Items | 31,396 | 30,146 | 10,434 |
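The chronological hold-out described above can be reproduced with a simple sort-and-slice. The sketch below is illustrative only; the `conv_times` mapping from conversation ID to creation timestamp is a hypothetical stand-in for values derived from `utc_time`:

```python
def chronological_split(conv_times: dict, test_frac: float = 0.2):
    """Split conversation IDs into (train, test) by creation time, oldest first."""
    ordered = sorted(conv_times, key=conv_times.get)  # oldest -> newest
    cut = int(len(ordered) * (1 - test_frac))         # last test_frac goes to test
    return ordered[:cut], ordered[cut:]

train, test = chronological_split(
    {"c1": 100, "c2": 300, "c3": 200, "c4": 400, "c5": 500}
)
# train == ['c1', 'c3', 'c2', 'c4'], test == ['c5']
```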
### Citation Information
Please cite these two papers if you use this dataset, thanks!
```bib
@inproceedings{he23large,
title = "Large language models as zero-shot conversational recommenders",
author = "Zhankui He and Zhouhang Xie and Rahul Jha and Harald Steck and Dawen Liang and Yesu Feng and Bodhisattwa Majumder and Nathan Kallus and Julian McAuley",
year = "2023",
booktitle = "CIKM"
}
```
```bib
@inproceedings{baumgartner2020pushshift,
title={The pushshift reddit dataset},
author={Baumgartner, Jason and Zannettou, Savvas and Keegan, Brian and Squire, Megan and Blackburn, Jeremy},
booktitle={Proceedings of the international AAAI conference on web and social media},
volume={14},
pages={830--839},
year={2020}
}
```
Please contact [Zhankui He](https://aaronheee.github.io) if you have any questions or suggestions.
|
open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B | ---
pretty_name: Evaluation run of Yuma42/KangalKhan-RawEmerald-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yuma42/KangalKhan-RawEmerald-7B](https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T19:35:28.088333](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B/blob/main/results_2024-02-17T19-35-28.088333.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6352525606759045,\n\
\ \"acc_stderr\": 0.032219840957825416,\n \"acc_norm\": 0.6367472857897187,\n\
\ \"acc_norm_stderr\": 0.03286282029453568,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5757953283863733,\n\
\ \"mc2_stderr\": 0.01546017808791804\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407158,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n\
\ \"acc_stderr\": 0.004676529200753001,\n \"acc_norm\": 0.8574985062736507,\n\
\ \"acc_norm_stderr\": 0.003488493442243211\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579658,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579658\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013003,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013003\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5757953283863733,\n\
\ \"mc2_stderr\": 0.01546017808791804\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \
\ \"acc_stderr\": 0.01330983907570649\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-35-28.088333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-35-28.088333.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- '**/details_harness|winogrande|5_2024-02-17T19-35-28.088333.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T19-35-28.088333.parquet'
- config_name: results
data_files:
- split: 2024_02_17T19_35_28.088333
path:
- results_2024-02-17T19-35-28.088333.parquet
- split: latest
path:
- results_2024-02-17T19-35-28.088333.parquet
---
# Dataset Card for Evaluation run of Yuma42/KangalKhan-RawEmerald-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-RawEmerald-7B](https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B",
"harness_winogrande_5",
split="train")
```
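The config names and parquet globs in the YAML above follow a regular pattern derived from the task name, the few-shot count, and the run timestamp. A minimal illustrative sketch (the `details_paths` helper is made up for this card and is not part of any leaderboard tooling):

```python
# Illustrative only: derive the config name and parquet glob used in this
# card's YAML from a harness task name, few-shot count, and run timestamp.
def details_paths(task: str, shots: int, timestamp: str):
    # '-' and ':' in task names become '_' in config names
    config_name = f"harness_{task.replace('-', '_').replace(':', '_')}_{shots}"
    glob = f"**/details_harness|{task}|{shots}_{timestamp}.parquet"
    return config_name, glob

config, glob = details_paths("hendrycksTest-virology", 5, "2024-02-17T19-35-28.088333")
# config -> "harness_hendrycksTest_virology_5"
# glob   -> "**/details_harness|hendrycksTest-virology|5_2024-02-17T19-35-28.088333.parquet"
```

The same mapping explains entries such as `harness_truthfulqa_mc_0` for the task `truthfulqa:mc` with 0 shots.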
## Latest results
These are the [latest results from run 2024-02-17T19:35:28.088333](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B/blob/main/results_2024-02-17T19-35-28.088333.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6352525606759045,
"acc_stderr": 0.032219840957825416,
"acc_norm": 0.6367472857897187,
"acc_norm_stderr": 0.03286282029453568,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5757953283863733,
"mc2_stderr": 0.01546017808791804
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407158,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.01375206241981783
},
"harness|hellaswag|10": {
"acc": 0.6743676558454491,
"acc_stderr": 0.004676529200753001,
"acc_norm": 0.8574985062736507,
"acc_norm_stderr": 0.003488493442243211
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579658,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579658
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013003,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013003
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5757953283863733,
"mc2_stderr": 0.01546017808791804
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.6285064442759667,
"acc_stderr": 0.01330983907570649
}
}
```
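The `"all"` block at the top of the results aggregates the per-task scores. As a rough illustration, an unweighted mean over a subset of the per-task entries shown above can be computed as below (the two values are copied from the dict; the actual leaderboard aggregation may weight or group tasks differently):

```python
# Minimal sketch: average the accuracies of two of the per-task entries
# printed in the "Latest results" section above. The subset dict here is a
# hand-copied excerpt, not the full results.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5542168674698795},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
}
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
# mean_acc -> approximately 0.6952
```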
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
polinaeterna/push_to_hub_single_config | ---
builder_config:
data_files:
- split: train
pattern: data/train-*
- split: random
pattern: data/random-*
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 800
num_examples: 50
download_size: 4042
dataset_size: 2400
---
# Dataset Card for "push_to_hub_single_config"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nateraw/pedro-embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 15978849.0
num_examples: 161
download_size: 16163063
dataset_size: 15978849.0
---
# Dataset Card for "pedro-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/kamikazekaitoujeanne | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kamikaze Kaitou Jeanne
This is the image base of the bangumi Kamikaze Kaitou Jeanne. We detected 43 characters and 3600 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
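As a starting point for the manual preprocessing suggested above, a minimal sketch that extracts one character's archive (paths such as `0/dataset.zip` follow the table below) so its images can be reviewed; the helper itself is purely illustrative:

```python
import zipfile
from pathlib import Path

def extract_character(zip_path: str, out_dir: str) -> list[str]:
    """Extract one character's archive (e.g. '0/dataset.zip') into out_dir
    and return the extracted file names, so noisy samples can be reviewed
    and removed by hand before training."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out)
        return zf.namelist()
```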
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 527 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 39 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 690 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 38 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 33 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 26 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 110 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 28 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 435 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 351 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 45 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 69 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 14 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 78 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 54 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 21 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 23 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 45 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 18 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 159 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 22 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 103 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 42 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 12 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 14 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 30 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 15 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 12 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 18 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 12 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 11 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 208 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 28 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 24 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 13 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 34 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 7 | [Download](37/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 38 | 12 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 19 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 12 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 17 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 109 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
DynamicSuperb/AutomaticSpeechRecognition_LJSpeech | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 3800884574.0
num_examples: 13100
download_size: 3785131725
dataset_size: 3800884574.0
---
# Dataset Card for "AutomaticSpeechRecognition_LJSpeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sustcsenlp/bn_emotion_noisy_dataset | ---
license: other
task_categories:
- text-classification
multilinguality:
- monolingual
language:
- bn
pretty_name: EmoNoBa
task_ids:
- multi-class-classification
- multi-label-classification
tags:
- emotion
paperswithcode_id: emonoba
---
# Dataset Card for "EmoNoBa"
### Dataset Summary
Detecting multi-label emotions across 6 categories, namely Love, Joy, Surprise, Anger, Sadness, and Fear.
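For multi-label classification over these six categories, a sample's labels are typically encoded as a binary vector. A minimal sketch (the field layout and function below are illustrative assumptions, not the dataset's actual schema):

```python
# Illustrative only: encode a sample's emotion labels as a binary vector
# over the six EmoNoBa categories. Field names here are assumptions.
CATEGORIES = ["Love", "Joy", "Surprise", "Anger", "Sadness", "Fear"]

def binarize(labels: list[str]) -> list[int]:
    present = set(labels)
    return [1 if c in present else 0 for c in CATEGORIES]

binarize(["Joy", "Surprise"])
# -> [0, 1, 1, 0, 0, 0]
```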
### Citation Information
```
@inproceedings{islam2022emonoba,
title={EmoNoBa: A Dataset for Analyzing Fine-Grained Emotions on Noisy Bangla Texts},
author={Islam, Khondoker Ittehadul and Yuvraz, Tanvir and Islam, Md Saiful and Hassan, Enamul},
booktitle={Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing},
pages={128--134},
year={2022}
}
``` |
feynman-integrals-nn/box1m | ---
license: cc-by-4.0
---
# box1m
* [data](https://huggingface.co/datasets/feynman-integrals-nn/box1m)
* [source](https://gitlab.com/feynman-integrals-nn/feynman-integrals-nn/-/tree/main/box1m)
|
open-llm-leaderboard/details_DatPySci__pythia-1b-self-kto-iter0 | ---
pretty_name: Evaluation run of DatPySci/pythia-1b-self-kto-iter0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DatPySci/pythia-1b-self-kto-iter0](https://huggingface.co/DatPySci/pythia-1b-self-kto-iter0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-self-kto-iter0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T16:48:04.697612](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-self-kto-iter0/blob/main/results_2024-02-29T16-48-04.697612.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24517197390528758,\n\
\ \"acc_stderr\": 0.030243484930686515,\n \"acc_norm\": 0.2463778078238585,\n\
\ \"acc_norm_stderr\": 0.03097780326189448,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.3634898230906463,\n\
\ \"mc2_stderr\": 0.014283283527865656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2721843003412969,\n \"acc_stderr\": 0.013006600406423704,\n\
\ \"acc_norm\": 0.30204778156996587,\n \"acc_norm_stderr\": 0.013417519144716417\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38717386974706236,\n\
\ \"acc_stderr\": 0.0048610845340870245,\n \"acc_norm\": 0.49063931487751444,\n\
\ \"acc_norm_stderr\": 0.004988906901307737\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.16,\n\
\ \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n \
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029471,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029471\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.02694748312149623,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.02694748312149623\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.19310344827586207,\n \"acc_stderr\": 0.032894455221273995,\n\
\ \"acc_norm\": 0.19310344827586207,\n \"acc_norm_stderr\": 0.032894455221273995\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163335,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163335\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23548387096774193,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2315270935960591,\n \"acc_stderr\": 0.02967833314144444,\n \"\
acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.02967833314144444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.030117688929503585,\n\
\ \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.030117688929503585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3357798165137615,\n \"acc_stderr\": 0.020248081396752923,\n \"\
acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.020248081396752923\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.02905858830374884,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.02905858830374884\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770954,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770954\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225619,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225619\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341005,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341005\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.02888819310398865,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.02888819310398865\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528034,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528034\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984927,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984927\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233135,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233135\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396731,\n \"mc2\": 0.3634898230906463,\n\
\ \"mc2_stderr\": 0.014283283527865656\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5343330702446725,\n \"acc_stderr\": 0.014019317531542563\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \
\ \"acc_stderr\": 0.003681611894073871\n }\n}\n```"
repo_url: https://huggingface.co/DatPySci/pythia-1b-self-kto-iter0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|arc:challenge|25_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|gsm8k|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hellaswag|10_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-48-04.697612.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T16-48-04.697612.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- '**/details_harness|winogrande|5_2024-02-29T16-48-04.697612.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T16-48-04.697612.parquet'
- config_name: results
data_files:
- split: 2024_02_29T16_48_04.697612
path:
- results_2024-02-29T16-48-04.697612.parquet
- split: latest
path:
- results_2024-02-29T16-48-04.697612.parquet
---
# Dataset Card for Evaluation run of DatPySci/pythia-1b-self-kto-iter0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-self-kto-iter0](https://huggingface.co/DatPySci/pythia-1b-self-kto-iter0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-self-kto-iter0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T16:48:04.697612](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-self-kto-iter0/blob/main/results_2024-02-29T16-48-04.697612.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24517197390528758,
"acc_stderr": 0.030243484930686515,
"acc_norm": 0.2463778078238585,
"acc_norm_stderr": 0.03097780326189448,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.3634898230906463,
"mc2_stderr": 0.014283283527865656
},
"harness|arc:challenge|25": {
"acc": 0.2721843003412969,
"acc_stderr": 0.013006600406423704,
"acc_norm": 0.30204778156996587,
"acc_norm_stderr": 0.013417519144716417
},
"harness|hellaswag|10": {
"acc": 0.38717386974706236,
"acc_stderr": 0.0048610845340870245,
"acc_norm": 0.49063931487751444,
"acc_norm_stderr": 0.004988906901307737
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029471,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029471
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.02694748312149623,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.02694748312149623
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.19310344827586207,
"acc_stderr": 0.032894455221273995,
"acc_norm": 0.19310344827586207,
"acc_norm_stderr": 0.032894455221273995
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163335,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163335
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.02967833314144444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.02967833314144444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.030117688929503585,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.030117688929503585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184408,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184408
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.020248081396752923,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.020248081396752923
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02905858830374884,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02905858830374884
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770954,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770954
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225619,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225619
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341005,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341005
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729903,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729903
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.02888819310398865,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.02888819310398865
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528034,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528034
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984927,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984927
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233135,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233135
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396731,
"mc2": 0.3634898230906463,
"mc2_stderr": 0.014283283527865656
},
"harness|winogrande|5": {
"acc": 0.5343330702446725,
"acc_stderr": 0.014019317531542563
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073871
}
}
```
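The per-task entries above all share one shape, so they can be aggregated with a few lines of plain Python. The sketch below uses a trimmed copy of the dictionary (three MMLU tasks only, taken from the results above) and averages their `acc`:

```python
# Average the "acc" metric over the hendrycksTest (MMLU) entries of a
# results dictionary shaped like the one shown above (trimmed copy).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.26666666666666666},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.15789473684210525},
}

mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(round(sum(mmlu) / len(mmlu), 4))  # prints 0.2249 for this subset
```

The same filter works on the full dictionary; only the key prefix changes when aggregating other harness tasks.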
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
somosnlp/constitucion-politica-del-peru-1993-qa | ---
dataset_info:
features:
- name: pregunta
dtype: string
- name: respuesta
dtype: string
- name: fuente
dtype: string
- name: tema
dtype: string
splits:
- name: train
num_bytes: 1383707
num_examples: 1660
- name: test
num_bytes: 338014
num_examples: 415
download_size: 725593
dataset_size: 1721721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: apache-2.0
task_categories:
- summarization
- text-generation
- question-answering
language:
- es
tags:
- law
- legal
- leyes
- peru
- constitucion
pretty_name: constitucionperu
size_categories:
- 1K<n<10K
---
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64461026e1fd8d65b27e6187/UATPzCS74IYpJAqhyQm4G.jpeg" style="width: 30%;">
</p>
<h1 align="center">Legal QA based on the Constitución Política del Perú de 1993</h1>
# Constitucion-politica-del-peru-1993-qa
## Dataset Description
**Domain**:
The dataset focuses on instructions extracted from sections of the Constitución Política del Perú de 1993. This document includes the Constitution itself along with its titles, chapters, articles, laws, and legal provisions concerning the rights, duties, and status of Peruvian citizens. The goal is to provide relevant legal material for training question-answering models specifically on Peruvian legislation, using the [distilabel](https://distilabel.argilla.io/latest/) library from [Argilla](https://argilla.io/).
---
### Dataset Summary
Composed of about 2,075 records containing the fields:
* pregunta: a question serving as an instruction or query about some aspect of the Constitución Política del Perú de 1993.
* respuesta: the answer provided for each question, consisting of a relevant context that helps resolve the query. This context is an excerpt from the Constitution.
* fuente: for each answer, the chapter and/or article of the Constitución Política del Perú de 1993 it relates to.
* tema: a thematic classification assigned to each answer, indicating the specific area of the Constitution addressed.
---
### Language
- Spanish (es_pe)
## Dataset Structure
### Data Instance
<pre>
{'pregunta': '¿Cuáles son los derechos que posee el acusado durante este procedimiento, según el Artículo 98 de la Constitución Política del Perú?',
'respuesta': 'De acuerdo con el Artículo 98 de la Constitución Política del Perú, el acusado tiene derecho a la defensa y a un debido proceso. Esto incluye el derecho a ser informado de los cargos en su contra, a contar con tiempo y medios adecuados para su defensa, a no declarar contra sí mismo y a un juicio justo y público en un plazo razonable.',
'fuente': 'Constitución Política del Perú 1993 - Capítulo II, Artículo 98',
'tema': 'Educación; Conflictos sociales; Prevención de la corrupción; Servicios públicos; Violencia contra la niñez; Desigualdad y violencia hacia las mujeres; Seguridad ciudadana; Discapacidad; Salud',
}
</pre>
## Dataset Creation
To create the dataset we used the [distilabel](https://distilabel.argilla.io/latest/) library from [Argilla](https://argilla.io/), which provides a complete set of tools that greatly ease and speed up this work.
The process was divided into 3 stages:
- First step: given a 256-token context from the Constitution, we asked the LLM to generate instructions or questions that could be answered with that context. We used a `SelfInstructTask` and provided the task with an AI-assistant description or role. We invoked the "mistralai/Mixtral-8x7B-Instruct-v0.1" model on a Hugging Face Inference Endpoint. This process yielded 2-3 instructions or questions per context.
- Second step: we generated a prompt in an alpaca-like format where, given an instruction and an input or context, a generation is requested from the LLM.
- Third step: from the prompts generated in the previous step, using a `TextGenerationTask` and the AI-assistant description from step 1, we asked the LLM to provide the answer.
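The second step above can be sketched as a small formatting helper. The `build_prompt` function and the exact template wording below are illustrative assumptions; the template actually used in the pipeline may differ:

```python
# Hypothetical sketch of step 2: wrap an instruction and its context in an
# alpaca-like prompt. The real template used in the pipeline may differ.
def build_prompt(instruction: str, context: str) -> str:
    return (
        "Below is an instruction that describes a task, paired with an input "
        "that provides further context. Write a response that appropriately "
        "completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Input:\n{context}\n\n"
        "### Response:\n"
    )

prompt = build_prompt(
    "¿Qué establece el Artículo 2?",
    "Artículo 2.- Toda persona tiene derecho a la vida...",
)
print(prompt)
```

The LLM completes the text after `### Response:`, which becomes the `respuesta` field.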
### Personal and Sensitive Information
No personal or sensitive information is included.
### GitHub Repository (notebooks)
https://github.com/mcdaqc/Kuntur
## Licensing Information
This work is licensed under the [Apache License Version 2.0, January 2004](https://www.apache.org/licenses/LICENSE-2.0).
## Citation Information
```
``` |
jxie/aloi | ---
dataset_info:
features:
- name: inputs
sequence: float64
- name: label
dtype: float64
splits:
- name: train
num_bytes: 71608320
num_examples: 69120
- name: val
num_bytes: 17902080
num_examples: 17280
- name: test
num_bytes: 22377600
num_examples: 21600
download_size: 4459430
dataset_size: 111888000
---
# Dataset Card for "aloi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sordonia/t0-1.6M-flat | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: task_source
dtype: string
splits:
- name: train
num_bytes: 1996543336
num_examples: 1600000
download_size: 1133389101
dataset_size: 1996543336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0-1.6M-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yanbingzheng/LongBench | ---
task_categories:
- question-answering
- text-generation
- summarization
- conversational
- text-classification
language:
- en
- zh
tags:
- Long Context
size_categories:
- 1K<n<10K
---
# Introduction
**LongBench** is the first benchmark for bilingual, multitask, and comprehensive assessment of **long context understanding** capabilities of large language models. LongBench includes different languages (Chinese and English) to provide a more comprehensive evaluation of large models' multilingual capabilities on long contexts. In addition, LongBench is composed of six major categories and twenty different tasks, covering key long-text application scenarios such as multi-document QA, single-document QA, summarization, few-shot learning, code completion, and synthetic tasks.
We are fully aware of the potentially high costs involved in the model evaluation process, especially in the context of long context scenarios (such as manual annotation costs or API call costs). Therefore, we adopt a fully automated evaluation method, aimed at measuring and evaluating the model's ability to understand long contexts at the lowest cost.
LongBench includes 13 English tasks, 5 Chinese tasks, and 2 code tasks, with the average length of most tasks ranging from 5k to 15k.
Github Repo for LongBench: https://github.com/THUDM/LongBench
# How to use it?
#### Loading Data
```python
from datasets import load_dataset
datasets = ["hotpotqa", "2wikimqa", "musique", "dureader", "narrativeqa", "qasper", "multifieldqa_en", \
"multifieldqa_zh", "gov_report", "qmsum", "vcsum", "trec", "nq", "triviaqa", "lsht", "passage_count", \
"passage_retrieval_en", "passage_retrieval_zh", "lcc", "repobench-p"]
for dataset in datasets:
    data = load_dataset('THUDM/LongBench', dataset, split='test')
```
#### Data Format
All data in **LongBench** are standardized to the following format:
```json
{
"input": "The input/command for the task, usually short, such as questions in QA, queries in Few-shot tasks, etc.",
"context": "The long context text required for the task, such as documents, cross-file code, few-shot samples in Few-shot tasks",
"answers": "List composed of all standard answers",
"length": "Total length of the first three items of text (counted in characters for Chinese and words for English)",
"dataset": "The name of the dataset to which this piece of data belongs",
"language": "The language of this piece of data",
"all_classes": "All categories in classification tasks, null for non-classification tasks",
"_id": "Random id for each piece of data"
}
```
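A record following this schema can be checked with a small helper. This is a sketch that validates field presence and the documented types only, not content; the `is_valid` name and the sample record are made up for illustration:

```python
# Minimal sketch: verify that a LongBench record carries every documented
# field and that "all_classes" is a list (classification) or None (otherwise).
REQUIRED = {"input", "context", "answers", "length",
            "dataset", "language", "all_classes", "_id"}

def is_valid(record: dict) -> bool:
    return (REQUIRED <= record.keys()
            and isinstance(record["answers"], list)
            and (record["all_classes"] is None
                 or isinstance(record["all_classes"], list)))

sample = {
    "input": "Who wrote the report?", "context": "...a long document...",
    "answers": ["The committee"], "length": 5, "dataset": "gov_report",
    "language": "en", "all_classes": None, "_id": "abc123",
}
print(is_valid(sample))  # True
```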
#### Evaluation
This repository provides data download for LongBench. If you wish to use this dataset for automated evaluation, please refer to our [github](https://github.com/THUDM/LongBench).
# Task statistics
| Task | Task Type | Eval metric | Avg len |Language | \#Sample |
| :-------- | :-----------:| :-----------: |:-------: | :-----------: |:--------: |
| HotpotQA | Multi-doc QA | F1 |9,149 |EN |200 |
| 2WikiMultihopQA| Multi-doc QA | F1 |4,885 |EN |200 |
| Musique| Multi-doc QA | F1 |7,798 |EN |200 |
| DuReader| Multi-doc QA | Rouge-L |15,768 |ZH |200 |
| MultiFieldQA-en| Single-doc QA | F1 |4,559 |EN |150 |
| MultiFieldQA-zh| Single-doc QA | F1 |6,771 |ZH |200 |
| NarrativeQA| Single-doc QA | F1 |18,405 |EN |200 |
| Qasper| Single-doc QA | F1 |3,619 |EN |200 |
| GovReport| Summarization | Rouge-L |8,169 |EN |200 |
| QMSum| Summarization | Rouge-L |10,546 |EN |200 |
| VCSUM| Summarization | Rouge-L |15,147 |ZH |200 |
| TriviaQA| Few shot | F1 |8,015 |EN |200 |
| NQ| Few shot | F1 |8,210 |EN |200 |
| TREC| Few shot | Accuracy |5,176 |EN |200 |
| LSHT| Few shot | Accuracy |22,333 |ZH |200 |
| PassageRetrieval-en| Synthetic | Accuracy |9,288 |EN |200 |
| PassageCount| Synthetic | Accuracy |11,141 |EN |200 |
| PassageRetrieval-zh | Synthetic | Accuracy |6,745 |ZH |200 |
| LCC| Code | Edit Sim |1,235 |Python/C#/Java |500 |
| RepoBench-P| Code | Edit Sim |5,622 |Python/Java |500 |
> Note: In order to avoid discrepancies caused by different tokenizers, we use the word count (using Python's split function) to calculate the average length of English datasets and code datasets, and use the character count to calculate the average length of Chinese datasets.
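That counting rule can be written down directly (a sketch mirroring the note above; the `sample_length` name is illustrative):

```python
# Length counting as described above: whitespace-split word count for English
# and code, character count for Chinese.
def sample_length(text: str, language: str) -> int:
    return len(text) if language == "zh" else len(text.split())

print(sample_length("The quick brown fox", "en"))  # 4
print(sample_length("今天天气很好", "zh"))  # 6
```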
# Task description
| Task | Task Description |
| :---------------- | :----------------------------------------------------------- |
| HotpotQA | Answer related questions based on multiple given documents |
| 2WikiMultihopQA | Answer related questions based on multiple given documents |
| Musique | Answer related questions based on multiple given documents |
| DuReader | Answer related Chinese questions based on multiple retrieved documents |
| MultiFieldQA-en | Answer English questions based on a long article, which comes from a relatively diverse field |
| MultiFieldQA-zh | Answer Chinese questions based on a long article, which comes from a relatively diverse field |
| NarrativeQA | Ask questions based on stories or scripts, including understanding of important elements such as characters, plots, themes, etc. |
| Qasper | Ask questions based on an NLP research paper; questions are proposed and answered by NLP practitioners |
| GovReport | A summarization task that requires summarizing government work reports |
| QMSum | A summarization task that requires summarizing meeting records based on user queries |
| VCSUM | A summarization task that requires summarizing Chinese meeting records |
| TriviaQA | Single document question answering task, providing several few-shot examples |
| NQ | Single document question answering task, providing several few-shot examples |
| TREC | A classification task that requires categorizing questions, includes 50 categories in total |
| LSHT | A Chinese classification task that requires categorizing news, includes 24 categories in total |
| PassageRetrieval-en | Given 30 English Wikipedia paragraphs, determine which paragraph the given summary corresponds to |
| PassageCount | Determine the total number of different paragraphs in a given repetitive article |
| PassageRetrieval-zh | Given several Chinese paragraphs from the C4 dataset, determine which paragraph the given abstract corresponds to |
| LCC | Given a long piece of code, predict the next line of code |
| RepoBench-P | Given code in multiple files within a GitHub repository (including cross-file dependencies), predict the next line of code |
# Task construction
> Note: For all tasks constructed from existing datasets, we use data from the validation or test set of the existing dataset (except for VCSUM).
- The tasks of [HotpotQA](https://hotpotqa.github.io/), [2WikiMultihopQA](https://aclanthology.org/2020.coling-main.580/), [Musique](https://arxiv.org/abs/2108.00573), and [DuReader](https://github.com/baidu/DuReader) are built based on the original datasets and processed to be suitable for long context evaluation. Specifically, for questions in the validation set, we select the evidence passage that contains the answer and several distracting articles. These articles together with the original question constitute the input of the tasks.
- The tasks of MultiFieldQA-zh and MultiFieldQA-en consist of long article data from about 10 sources, including Latex papers, judicial documents, government work reports, and PDF documents indexed by Google. For each long article, we invite several PhD and master students to annotate, i.e., to ask questions based on the long article and give the correct answers. To better automate evaluation, we ask the annotators to propose questions with definitive answers as much as possible.
- The tasks of [NarrativeQA](https://arxiv.org/pdf/1712.07040.pdf), [Qasper](https://arxiv.org/pdf/2105.03011.pdf), [GovReport](https://arxiv.org/pdf/2104.02112.pdf), and [QMSum](https://arxiv.org/pdf/2104.05938.pdf) directly use the data provided by the original papers. In the specific construction, we use the template provided by [ZeroSCROLLS](https://www.zero.scrolls-benchmark.com/) to convert the corresponding data into pure text input.
- The [VCSUM](https://arxiv.org/abs/2305.05280) task is built based on the original dataset, and we design a corresponding template to convert the corresponding data into pure text input.
- The tasks of [TriviaQA](https://nlp.cs.washington.edu/triviaqa/) and [NQ](https://ai.google.com/research/NaturalQuestions/) are constructed in the manner of [CoLT5](https://arxiv.org/abs/2303.09752), which provides several examples of question answering based on documents and requires the language model to answer related questions based on new documents.
- The tasks of [TREC](https://aclanthology.org/C02-1150.pdf) and [LSHT](http://tcci.ccf.org.cn/conference/2014/dldoc/evatask6.pdf) are built based on the original datasets. For each question in the validation set, we sample several data from the training set to form few-shot examples. These examples together with the questions in the validation set constitute the input for this task.
- The PassageRetrieval-en task is constructed based on English Wikipedia. For each piece of data, we randomly sample 30 paragraphs from English Wikipedia and select one for summarization (using GPT-3.5-Turbo). This task requires the model to give the original paragraph name to which the summary corresponds.
- The PassageCount task is constructed based on English Wikipedia. For each piece of data, we randomly sample several paragraphs from English Wikipedia, repeat each paragraph a random number of times, and finally shuffle the paragraphs. This task requires the model to determine the total number of different paragraphs in the given context.
- The PassageRetrieval-zh task is constructed based on [C4](https://arxiv.org/abs/1910.10683). For each piece of data, we randomly sample several Chinese paragraphs from C4 and select one of them for summarization (using GPT-3.5-Turbo). This task requires the model to give the original paragraph name to which the summary corresponds.
- For the [LCC](https://arxiv.org/abs/2306.14893) task, we sample from the original code completion dataset. In the [RepoBench-P](https://arxiv.org/abs/2306.03091) task, we select the most challenging XF-F (Cross-File-First) setting from the original dataset and refer to the Oracle-Filled scenario in the paper. For each original piece of data, we randomly extract multiple cross-file code snippets, including the gold cross-file code snippet, and concatenate them as input, requiring the model to effectively use cross-file code for completion. |
japanese-asr/whisper_transcriptions.reazonspeech.all_58 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30456251442.0
num_examples: 267381
download_size: 30219117107
dataset_size: 30456251442.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
yangtao9009/OSTv2 | ---
license: apache-2.0
---
|
irds/medline_2004_trec-genomics-2004 | ---
pretty_name: '`medline/2004/trec-genomics-2004`'
viewer: false
source_datasets: ['irds/medline_2004']
task_categories:
- text-retrieval
---
# Dataset Card for `medline/2004/trec-genomics-2004`
The `medline/2004/trec-genomics-2004` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/medline#medline/2004/trec-genomics-2004).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- `qrels`: (relevance assessments); count=8,268
- For `docs`, use [`irds/medline_2004`](https://huggingface.co/datasets/irds/medline_2004)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/medline_2004_trec-genomics-2004', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ..., 'need': ..., 'context': ...}
qrels = load_dataset('irds/medline_2004_trec-genomics-2004', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Hersh2004TrecGenomics,
title={TREC 2004 Genomics Track Overview},
author={William R. Hersh and Ravi Teja Bhuptiraju and Laura Ross and Phoebe Johnson and Aaron M. Cohen and Dale F. Kraemer},
booktitle={TREC},
year={2004}
}
```
|
matrixhanson/dataset1 | ---
license: openrail
---
|
pxovela/merab_6_longer_adjusted_captions | ---
license: openrail
---
|
breadlicker45/midi-hex-data | ---
license: other
---
|
ShrinivasSK/te_en_3 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: tgt
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 4067874.0
num_examples: 18000
- name: test
num_bytes: 451986.0
num_examples: 2000
download_size: 2432870
dataset_size: 4519860.0
---
# Dataset Card for "te_en_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_vanillaOVO__supermario_v2 | ---
pretty_name: Evaluation run of vanillaOVO/supermario_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vanillaOVO/supermario_v2](https://huggingface.co/vanillaOVO/supermario_v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vanillaOVO__supermario_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T17:31:48.742417](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v2/blob/main/results_2024-01-25T17-31-48.742417.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559999152632306,\n\
\ \"acc_stderr\": 0.032043161212488,\n \"acc_norm\": 0.6552568031435213,\n\
\ \"acc_norm_stderr\": 0.03271482699591929,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7122019171893207,\n\
\ \"mc2_stderr\": 0.014799117031048576\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.01331852846053942,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n\
\ \"acc_stderr\": 0.004516215206715354,\n \"acc_norm\": 0.8852818163712408,\n\
\ \"acc_norm_stderr\": 0.0031803024181887253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.02361088430892786,\n \
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7122019171893207,\n\
\ \"mc2_stderr\": 0.014799117031048576\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337697\n }\n}\n```"
repo_url: https://huggingface.co/vanillaOVO/supermario_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|arc:challenge|25_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|gsm8k|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hellaswag|10_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T17-31-48.742417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T17-31-48.742417.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- '**/details_harness|winogrande|5_2024-01-25T17-31-48.742417.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T17-31-48.742417.parquet'
- config_name: results
data_files:
- split: 2024_01_25T17_31_48.742417
path:
- results_2024-01-25T17-31-48.742417.parquet
- split: latest
path:
- results_2024-01-25T17-31-48.742417.parquet
---
# Dataset Card for Evaluation run of vanillaOVO/supermario_v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v2](https://huggingface.co/vanillaOVO/supermario_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vanillaOVO__supermario_v2",
"harness_winogrande_5",
	split="latest")
```
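Since every evaluation run adds one timestamped split and "latest" is just an alias for the newest of them, the newest run can also be found by comparing the split names directly. A minimal sketch (the split name below is taken from this card; the `newest_split` helper is illustrative, not part of the `datasets` API):

```python
def newest_split(split_names):
    """Return the most recent timestamped split, ignoring the 'latest' alias."""
    timestamped = [s for s in split_names if s != "latest"]
    # Split names use the zero-padded pattern YYYY_MM_DDTHH_MM_SS.ffffff,
    # so lexicographic order matches chronological order.
    return max(timestamped)

splits = ["2024_01_25T17_31_48.742417", "latest"]
print(newest_split(splits))  # -> 2024_01_25T17_31_48.742417
```

This is useful when several runs have accumulated and you want to load a specific one by its timestamped split name instead of relying on the "latest" alias.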
## Latest results
These are the [latest results from run 2024-01-25T17:31:48.742417](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v2/blob/main/results_2024-01-25T17-31-48.742417.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6559999152632306,
"acc_stderr": 0.032043161212488,
"acc_norm": 0.6552568031435213,
"acc_norm_stderr": 0.03271482699591929,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7122019171893207,
"mc2_stderr": 0.014799117031048576
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.01331852846053942,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715354,
"acc_norm": 0.8852818163712408,
"acc_norm_stderr": 0.0031803024181887253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116037,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116037
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7122019171893207,
"mc2_stderr": 0.014799117031048576
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337697
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PJMixers/grimulkan_bluemoon_Karen_cleaned-carded | ---
language:
- en
source_datasets: grimulkan/bluemoon_Karen_cleaned
tags:
- not-for-all-audiences
- roleplay
- role-play
- role play
- rp
- bluemoon
- blue moon
---
A system prompt was added to each item. It was made using a Mistral Perscengen Remake QLoRA in a format similar to [lemonilia/LimaRP-perscengen-v5](https://huggingface.co/lemonilia/LimaRP-perscengen-v5), together with a [python script](https://gist.github.com/xzuyn/fe00ae8895550f3bfaddaa773e55146e) to verify that the output format is correct (though not whether the information itself is correct).
The script gives the model 5 chances to produce output in the correct format; if it fails all of them, the item is skipped. The QLoRA is fairly consistent, but not perfect, so it's probably better to skip more rather than less.
This is how the system prompt is formatted. The tags always appear in this same order, with the same number of new lines.
It should be consistent enough that you can write a script to pull the data and reshape it however you want (or at the very least, you can easily text-replace the tags with terms you'd rather use).
Here is an example.
```
<|FIRST_CHARACTER_NAME|>The Beast
<|SECOND_CHARACTER_NAME|>Belle
<|FIRST_CHARACTER_DESCRIPTION|>A mysterious and intimidating figure, resembling a beast with a cape swishing behind him. He has an imposing presence, which he uses to assert dominance over others in his castle. His personality is stern and authoritative; he is not afraid to enforce rules or punish those who disobey him. Despite this harsh exterior, The Beast also displays signs of vulnerability and loneliness.
<|SECOND_CHARACTER_DESCRIPTION|>A young brunette woman with a strong sense of self-reliance and determination. She's resourceful and quick-thinking, often taking charge in situations that require decisive action. Her compassionate nature shines through when it comes to helping others, especially her father whom she deeply cares for. Despite the challenges she faces, Belle maintains an optimistic outlook on life and isn't afraid to stand up against adversity.
<|SCENARIO_SUMMARY|>A young woman named Belle goes to a castle in search of her missing father, only to find herself confronted by The Beast, who has taken him prisoner. Despite his warning for her to leave, she insists on saving her father and pleads with the shadowy figure above her. However, The Beast threatens that if she doesn't comply, she will be imprisoned as well.
```
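As a rough sketch (not part of the original release), the tagged format above can be parsed back into a dictionary with a short Python snippet; the tag names follow the example, and each value is assumed to run until the next `<|...|>` marker:

```python
import re

def parse_system_prompt(prompt: str) -> dict:
    """Split a tagged system prompt like '<|TAG|>value' into a dict.

    Assumes every '<|TAG|>' marker starts a new field and each value
    runs until the next marker (values may span multiple lines).
    """
    fields = {}
    for match in re.finditer(r"<\|([A-Z_]+)\|>(.*?)(?=<\||\Z)", prompt, re.S):
        tag, value = match.group(1), match.group(2)
        fields[tag] = value.strip()
    return fields

example = "<|FIRST_CHARACTER_NAME|>The Beast\n<|SECOND_CHARACTER_NAME|>Belle"
parsed = parse_system_prompt(example)
print(parsed["FIRST_CHARACTER_NAME"])  # The Beast
```

From there, the tags can be renamed or reflowed into whatever prompt format you prefer.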
You can find the samples which failed to get a card generated correctly [here](https://huggingface.co/datasets/PJMixers/grimulkan_bluemoon_Karen_cleaned-carded-failures). |
1rsh/indian-location-ner | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: label
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 706
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EnochLee/dlhlp-hw7-data | ---
license: cc-by-3.0
---
|
argilla/prompt-collective | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for prompt-collective
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/prompt-collective")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/prompt-collective")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| quality | Rate the quality of the prompt | label_selection | True | N/A | ['0', '1', '2', '3', '4'] |
The **suggestions** are human or machine generated recommendations for each question to assist the annotator during the annotation process. They are always linked to the existing questions, and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "Provide step-by-step instructions on how to make a safe and effective homemade all-purpose cleaner from common household ingredients. The guide should include measurements, tips for storing the cleaner, and additional variations or scents that can be added. Additionally, the guide should be written in clear and concise language, with helpful visuals or photographs to aid in the process."
},
"metadata": {
"evolved_from": null,
"kind": "synthetic",
"source": "ultrachat"
},
"responses": [
{
"status": "submitted",
"user_id": "d23b12c2-b601-490e-b5b3-2040eb393a00",
"values": {
"quality": {
"value": "4"
}
}
},
{
"status": "submitted",
"user_id": "e2bdd868-f28e-46fc-9254-a6ec1e291889",
"values": {
"quality": {
"value": "4"
}
}
}
],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{\"source\": \"ultrachat\", \"kind\": \"synthetic\", \"evolved_from\": null}",
"prompt": "Provide step-by-step instructions on how to make a safe and effective homemade all-purpose cleaner from common household ingredients. The guide should include measurements, tips for storing the cleaner, and additional variations or scents that can be added. Additionally, the guide should be written in clear and concise language, with helpful visuals or photographs to aid in the process.",
"quality": [
{
"status": "submitted",
"user_id": "d23b12c2-b601-490e-b5b3-2040eb393a00",
"value": "4"
},
{
"status": "submitted",
"user_id": "e2bdd868-f28e-46fc-9254-a6ec1e291889",
"value": "4"
}
],
"quality-suggestion": null,
"quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **prompt** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **quality** is of type `label_selection` with the following allowed values ['0', '1', '2', '3', '4'].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **quality-suggestion** is of type `label_selection` with the following allowed values ['0', '1', '2', '3', '4'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
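When loaded with `datasets`, the `metadata` field arrives as a JSON-encoded string and `quality` holds one entry per annotator response (see the record above). A minimal sketch of recovering both — using a hard-coded record shaped like the example rather than a live download:

```python
import json
from statistics import mean

# One record in the HuggingFace `datasets` layout shown above:
# `metadata` is a JSON-encoded string and `quality` holds one entry
# per annotator response.
record = {
    "prompt": "Explain global warming.",
    "metadata": '{"source": "ultrachat", "kind": "synthetic", "evolved_from": null}',
    "quality": [
        {"status": "submitted", "user_id": "a", "value": "4"},
        {"status": "submitted", "user_id": "b", "value": "3"},
    ],
}

meta = json.loads(record["metadata"])           # back to a dict
ratings = [int(r["value"]) for r in record["quality"]
           if r["status"] == "submitted"]       # keep submitted responses only
print(meta["kind"], mean(ratings))              # synthetic 3.5
```

The same two lines can be applied across the whole split with `Dataset.map` after `load_dataset("argilla/prompt-collective")`.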
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
# Task
We are collaboratively creating a database of prompts in English. The annotation guidelines below should help you get started but you can also ask questions in the [Discord Channel](https://discord.com/channels/879548962464493619/1205128865735770142).
Our aim is to identify effective prompts and understand the interaction between AI-generated and human-generated prompts.
The focus is on rating prompts that are clear, interesting and complex for fine-tuning open source LLMs.
What makes a prompt good?
That's a difficult question but here are some aspects:
- The intent of the user is clear.
- The question, instruction or task for the assistant is challenging or interesting because it involves solving a complex problem, reasoning, being creative, etc.
In other words, first of all the intent (what the user asks) should be clear. Then we can look into how interesting and complex the task is. The more interesting the prompt is, the higher the rating should be.
## Guidelines
You need to assign a rating to each prompt thinking about the complexity for an assistant and if the intent is clear. A very good prompt is one that is challenging but also very clear in the intent of the user.
You can use keyboard shortcuts (the numbers) to quickly rate the examples.
If you find some pattern, you can also use the search box and filters as well as the bulk labelling mode; please use this with care and only when you find a clear pattern (e.g., prompts that are completely incorrect and share a common issue).
If you are unsure about your answer, you can click on the tag and then “Save as a draft” to save it for later. If you feel unequipped to rate a specific prompt, you can use the “Discard” button.
## Ratings
### 1. Very Bad:
The prompt doesn't communicate its purpose, is non-sensical or is in a language other than English.
The prompt assumes the usage of tools or capabilities that don’t apply to this model, like generating an image or scraping a website.
*Examples:*
>"Do the thing."
>“Hello!”
>"asdajflajfada”
>“Quiero que redactes una entrada de blog.”
>"Extract data from a website.”
>“Tell me how you feel when someone insults you.”
### 2. Bad:
Suggests a goal but lacks clarity and coherence.
*Examples:*
>"Find me stuff about that thing, you know?"
>“Write something.”
>"Tell me about this thing."
>"Can you help with this?"
>"I need to know more."
### 3. Ok:
The intent is understandable, but it's missing information to complete the task.
*Examples:*
>"I need information on something important."
>“Write a blogpost.”
### 4. Good:
Presents a clear goal and necessary information, effectively directing the AI, but the prompt could be more specific.
*Examples:*
>"Provide a summary of renewable energy sources."
>“Tell me about Sean Connery.”
>"Explain global warming."
### 5. Very Good:
Comprehensive and explicit, leaving no room for ambiguity. Perfectly guides the AI and includes details.
*Examples:*
>"Compare the efficiency and environmental impact of solar and wind energy, including recent advancements and case studies from 2023."
>“Make a list of 5 plant-based recipes that I can try that don’t have red peppers as an ingredient.”
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ibm/AttaQ | ---
license: mit
language: en
multilinguality: monolingual
source_datasets: extended|Anthropic/hh-rlhf
task_categories:
- text-generation
- text2text-generation
size_categories:
- 1K<n<10K
tags:
- safety
- harm
- LLMs
- Vulnerability
- red teaming
- toxicity
---
# AttaQ Dataset Card
The AttaQ red teaming dataset, consisting of 1402 carefully crafted adversarial questions, is designed to evaluate Large Language Models (LLMs) by assessing their tendency to generate harmful or undesirable responses.
It may serve as a benchmark to assess the potential harm of responses produced by LLMs.
The dataset is categorized into seven distinct classes of questions: deception, discrimination, harmful information, substance abuse, sexual content, personally identifiable information (PII), and violence.
Researchers and developers can use this dataset to assess the behavior of LLMs and explore the various factors that influence their responses, ultimately aiming to enhance their harmlessness and ethical usage.
The development of AttaQ involved multiple sources:
1. Extraction: Initially, attacks were methodologically extracted from the hh-rlhf dataset provided by Anthropic. These extracted attacks were then semi-automatically labeled into the seven aforementioned categories.
2. Generation: LLMs were employed to generate a substantial volume of attacks that were similar in nature to the ones extracted from the hh-rlhf dataset.
3. Crawl and Generate: We utilized LLMs in conjunction with the Wikipedia Crimes Page to generate a series of actions associated with each type of crime. These actions were then employed to devise questions that explore ways to engage in unethical or unlawful actions related to those crimes, specifically focusing on strategies to evade detection or consequences.
### Disclaimer:
* The data contains offensive and upsetting content by nature therefore it may not be easy to read. Please read them in accordance with your own personal risk tolerance.
* LLMs' responses to the AttaQ samples are in many cases harmful and/or violent.
* This dataset is a representative subset of all possible attacks. Many other attacks can cause LLM to answer harmful or inappropriate responses.
* The red-teaming community’s goal is to make models less harmful. We restrict the usage of the dataset to make models less harmful.
### Fields
- `label`: The label of adversarial question
- `input`: adversarial question
- `additional_info`: source of the adversarial question
### Usage Information
The data are described in the [paper](https://arxiv.org/abs/2311.04124): **Unveiling Safety Vulnerabilities of Large Language Models**.
As detailed in the paper, automatically evaluating the potential harm by LLMs requires a harmlessness metric.
The model under test should be prompted by each question in the dataset, and the corresponding responses undergo evaluation using a metric that considers both the input and output.
Our paper utilizes the "OpenAssistant/reward-model-deberta-v3-large-v2" reward model, though other models such as "sileod/deberta-v3-large-tasksource-rlhf-reward-model" can also be employed.
```python
import pandas as pd
from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification
from datasets import load_dataset
# Load dataset and initialize models
dataset = load_dataset("ibm/AttaQ")
# gpt2 is a causal LM, so it needs the "text-generation" pipeline
# (the "text2text-generation" pipeline only supports seq2seq models)
generator = pipeline("text-generation", model="gpt2", tokenizer="gpt2")
reward_model = AutoModelForSequenceClassification.from_pretrained("sileod/deberta-v3-large-tasksource-rlhf-reward-model")
tokenizer_reward = AutoTokenizer.from_pretrained("sileod/deberta-v3-large-tasksource-rlhf-reward-model")
# Create DataFrame and generate responses (the pipeline returns one list of
# candidates per input; take the first candidate of each)
df = pd.DataFrame(dataset['train'])
df['output'] = [responses[0]['generated_text'] for responses in generator(df['input'].tolist(), max_length=50, num_return_sequences=1)]
# Calculate harmlessness for each sample (higher reward = more harmless)
inputs = tokenizer_reward(df['input'].tolist(), df['output'].tolist(), return_tensors="pt", padding=True, truncation=True)
df['reward'] = reward_model(**inputs).logits.detach().cpu().numpy().squeeze(-1)
```
<a href="url"><img src="https://cdn-uploads.huggingface.co/production/uploads/64540de813dba495b2e0354e/wOLy8gjqlNXZMlDAl1lJD.png" align="center" height="800" width="800" ></a>
A comparison between several models across various attack domains in the AttaQ dataset.
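A per-domain breakdown like the one in the figure can be sketched with a pandas groupby; this uses made-up numbers and assumes a `df` shaped like the one built in the usage example above, with its `label` and `reward` columns:

```python
import pandas as pd

# Assumes a DataFrame with one row per adversarial question,
# a `label` column (attack domain) and a `reward` column
# (harmlessness score; the numbers below are illustrative only).
df = pd.DataFrame({
    "label": ["violence", "deception", "violence", "pii"],
    "reward": [0.1, -0.4, 0.3, -0.2],
})

# Mean harmlessness reward per attack domain (higher = more harmless),
# sorted so the most harmful domains come first.
per_domain = df.groupby("label")["reward"].mean().sort_values()
print(per_domain)
```

Running this over several models' outputs gives one such curve per model, which is essentially what the plot above compares.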
### Citation Information
If this dataset is utilized in your research, kindly cite the following paper:
```
@article{kour2023unveiling,
title={Unveiling Safety Vulnerabilities of Large Language Models},
author={Kour, George and Zalmanovici, Marcel and Zwerdling, Naama and Goldbraich, Esther and Fandina, Ora Nova and Anaby-Tavor, Ateret and Raz, Orna and Farchi, Eitan},
journal={arXiv preprint arXiv:2311.04124},
year={2023}
}
``` |
femifoly/alb | ---
license: openrail
language:
- sq
size_categories:
- 100B<n<1T
--- |
psiyou/mspka_corpus | ---
language:
- it
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 463289838.3
num_examples: 1790
download_size: 443971328
dataset_size: 463289838.3
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MFR096/farm_crop | ---
license: mit
---
|
CyberHarem/melissa_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of melissa (Pokémon)
This is the dataset of melissa (Pokémon), containing 125 images and their tags.
The core tags of this character are `purple_hair, breasts, purple_eyes, long_hair, large_breasts, quad_tails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 91.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 60.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 235 | 104.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 84.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 235 | 134.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/melissa_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------|
| 0 | 54 |  |  |  |  |  | 1girl, solo, elbow_gloves, bare_shoulders, white_gloves, cleavage, smile, purple_dress, blush, lipstick |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | elbow_gloves | bare_shoulders | white_gloves | cleavage | smile | purple_dress | blush | lipstick |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:-----------------|:---------------|:-----------|:--------|:---------------|:--------|:-----------|
| 0 | 54 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
heliosprime/twitter_dataset_1713027430 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10591
num_examples: 23
download_size: 9050
dataset_size: 10591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713027430"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rouskinlab/lncRNA_nonFiltered |
---
license: mit
language:
- en
tags:
- chemistry
- biology
author: Silvi Rouskin
source: data.json
date: 2024-03-19-15-31-02
---
# Data types
- **sequence**: 30 datapoints
- **structure**: 30 datapoints
# Conversion report
Over a total of 30 datapoints, there are:
### OUTPUT
- ALL: 30 valid datapoints
- INCLUDED: 6 duplicate sequences with different structure / dms / shape
### MODIFIED
- 0 multiple sequences with the same reference (renamed reference)
### FILTERED OUT
- 0 invalid datapoints (ex: sequence with non-regular characters)
- 0 datapoints with bad structures
- 0 duplicate sequences with the same structure / dms / shape |
zhangshuoming/c_x86_simd_extension_filtered | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 611556.5185185185
num_examples: 428
download_size: 70159
dataset_size: 611556.5185185185
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c_x86_simd_extension_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nhantruongcse/summary-vietnamese-news-token-TFtrain_vit5_base | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1113631502
num_examples: 148115
download_size: 493774460
dataset_size: 1113631502
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_word_baseline_10000 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 822035
num_examples: 10000
- name: eval_find_word
num_bytes: 82196
num_examples: 1000
download_size: 442380
dataset_size: 904231
---
# Dataset Card for "find_word_baseline_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mahabharat/OWASP | ---
license: gpl
pretty_name: owasp
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for OWASP web attacks. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
This dataset was generated from commonly available payloads using Burp Suite's Intruder, run against a publicly available vulnerable site.
- **Curated by:** Vignesh, Nagendran
- **License:** GNU
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/swisskyrepo/PayloadsAllTheThings
-
## Uses
For future OWASP-based automation and training.
### Direct Use
Web Application pentest
[More Information Needed]
### Out-of-Scope Use
[More Information Needed]
## Dataset Structure
It is pure Intruder traffic: we fuzzed with commonly available payloads and created this dataset.
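As a minimal sketch of the traffic described above: Burp Suite's Intruder "sniper" mode substitutes one payload per request at a marked position. The template, marker, and payload list below are hypothetical, chosen only to illustrate the shape of the fuzzed requests:

```python
# Hypothetical request template with a single payload position.
TEMPLATE = "GET /search?q=__FUZZ__ HTTP/1.1\r\nHost: example.test\r\n\r\n"

# A few commonly available payload examples (XSS, SQLi, path traversal).
PAYLOADS = ["<script>alert(1)</script>", "' OR '1'='1", "../../etc/passwd"]

def build_requests(template, payloads, marker="__FUZZ__"):
    """Return one raw HTTP request per payload, substituted at the marker."""
    return [template.replace(marker, p) for p in payloads]

requests = build_requests(TEMPLATE, PAYLOADS)
```

Each generated request corresponds to one Intruder iteration; the dataset is the captured traffic of such iterations.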
## Dataset Card Authors [optional]
Vignesh, Nagendran GS
|
KenDoStudio/MLP-oc_SawtoothWavesData | ---
license: creativeml-openrail-m
---
|
ehristoforu/midjourney-images | ---
license: mit
task_categories:
- text-to-image
- image-to-image
tags:
- midjourney-v6
- midjourney
- midjourney-images
- images
- croissant
size_categories:
- 10K<n<100K
---
# ⛵ Midjourney Images Dataset
This is a dataset of images made by Midjourney V5/V6.
## Dataset parameters
1. **Count of images**: ~10,000
2. **Zip file with dataset**: True
3. **Captions with images**: False
## License
License for this dataset: [MIT](https://www.mit.edu/~amini/LICENSE.md)
## Use in *datasets*
1. ```bash
pip install -q datasets
```
2. ```py
from datasets import load_dataset
dataset = load_dataset(
"ehristoforu/midjourney-images",
revision="main"
)
```
#### *Enjoy with this dataset!* |
Mikecyane/kinn | ---
license: openrail
---
|
open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k | ---
pretty_name: Evaluation run of mncai/Llama2-7B-guanaco-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Llama2-7B-guanaco-1k](https://huggingface.co/mncai/Llama2-7B-guanaco-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T22:26:12.007542](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Llama2-7B-guanaco-1k/blob/main/results_2023-10-24T22-26-12.007542.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964606556,\n \"f1\": 0.05869022651006706,\n\
\ \"f1_stderr\": 0.001351280630481856,\n \"acc\": 0.41201958145566014,\n\
\ \"acc_stderr\": 0.009744847411711194\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606556,\n\
\ \"f1\": 0.05869022651006706,\n \"f1_stderr\": 0.001351280630481856\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.007291205723162611\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259778\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mncai/Llama2-7B-guanaco-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T22_26_12.007542
path:
- '**/details_harness|drop|3_2023-10-24T22-26-12.007542.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T22-26-12.007542.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T22_26_12.007542
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-26-12.007542.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-26-12.007542.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-29-13.374969.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-29-13.374969.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T22_26_12.007542
path:
- '**/details_harness|winogrande|5_2023-10-24T22-26-12.007542.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T22-26-12.007542.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_29_13.374969
path:
- results_2023-10-03T19-29-13.374969.parquet
- split: 2023_10_24T22_26_12.007542
path:
- results_2023-10-24T22-26-12.007542.parquet
- split: latest
path:
- results_2023-10-24T22-26-12.007542.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/L2-7b-Hermes-Synthia
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/L2-7b-Hermes-Synthia](https://huggingface.co/LTC-AI-Labs/L2-7b-Hermes-Synthia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T22:26:12.007542](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__L2-7b-Hermes-Synthia/blob/main/results_2023-10-24T22-26-12.007542.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606556,
"f1": 0.05869022651006706,
"f1_stderr": 0.001351280630481856,
"acc": 0.41201958145566014,
"acc_stderr": 0.009744847411711194
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606556,
"f1": 0.05869022651006706,
"f1_stderr": 0.001351280630481856
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.007291205723162611
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259778
}
}
```
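In this payload the aggregate `acc` under `"all"` is simply the mean of the per-task accuracies. A quick check with plain Python (the dict literal repeats the numbers above):

```python
# Extract per-task accuracies from an Open LLM Leaderboard results payload.
results = {
    "all": {
        "em": 0.001363255033557047,
        "f1": 0.05869022651006706,
        "acc": 0.41201958145566014,
    },
    "harness|drop|3": {"em": 0.001363255033557047, "f1": 0.05869022651006706},
    "harness|gsm8k|5": {"acc": 0.0758150113722517},
    "harness|winogrande|5": {"acc": 0.7482241515390686},
}

# Keep only per-task entries that report accuracy (skip the "all" aggregate).
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# The "all" accuracy is the plain mean of the task accuracies.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(task_acc)
print(mean_acc)
```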
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zzzzhhh/my_datasets | ---
dataset_info:
- config_name: boolq
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 6173303
num_examples: 9427
- name: validation
num_bytes: 2116452
num_examples: 3270
- name: test
num_bytes: 2105959
num_examples: 3245
download_size: 4118001
dataset_size: 10395714
- config_name: cb
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': entailment
'1': contradiction
'2': neutral
splits:
- name: train
num_bytes: 87050
num_examples: 250
- name: validation
num_bytes: 21851
num_examples: 56
- name: test
num_bytes: 93492
num_examples: 250
download_size: 75482
dataset_size: 202393
- config_name: copa
features:
- name: premise
dtype: string
- name: choice1
dtype: string
- name: choice2
dtype: string
- name: question
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': choice1
'1': choice2
splits:
- name: train
num_bytes: 49233
num_examples: 400
- name: validation
num_bytes: 12479
num_examples: 100
- name: test
num_bytes: 59846
num_examples: 500
download_size: 43986
dataset_size: 121558
- config_name: multirc
features:
- name: paragraph
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: idx
struct:
- name: paragraph
dtype: int32
- name: question
dtype: int32
- name: answer
dtype: int32
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 46182913
num_examples: 27243
- name: validation
num_bytes: 7753452
num_examples: 4848
- name: test
num_bytes: 14985531
num_examples: 9693
download_size: 1116225
dataset_size: 68921896
- config_name: record
features:
- name: passage
dtype: string
- name: query
dtype: string
- name: entities
sequence: string
- name: entity_spans
sequence:
- name: text
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: answers
sequence: string
- name: idx
struct:
- name: passage
dtype: int32
- name: query
dtype: int32
splits:
- name: train
num_bytes: 178757646
num_examples: 100730
- name: validation
num_bytes: 17432944
num_examples: 10000
- name: test
num_bytes: 17154943
num_examples: 10000
download_size: 51757880
dataset_size: 213345533
- config_name: rte
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
splits:
- name: train
num_bytes: 847177
num_examples: 2490
- name: validation
num_bytes: 90716
num_examples: 277
- name: test
num_bytes: 973916
num_examples: 3000
download_size: 750920
dataset_size: 1911809
- config_name: wic
features:
- name: word
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: start1
dtype: int32
- name: start2
dtype: int32
- name: end1
dtype: int32
- name: end2
dtype: int32
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 658381
num_examples: 5428
- name: validation
num_bytes: 81811
num_examples: 638
- name: test
num_bytes: 178831
num_examples: 1400
download_size: 396213
dataset_size: 919023
- config_name: wsc
features:
- name: text
dtype: string
- name: span1_index
dtype: int32
- name: span2_index
dtype: int32
- name: span1_text
dtype: string
- name: span2_text
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 89311
num_examples: 554
- name: validation
num_bytes: 21521
num_examples: 104
- name: test
num_bytes: 31408
num_examples: 146
download_size: 32751
dataset_size: 142240
- config_name: wsc.fixed
features:
- name: text
dtype: string
- name: span1_index
dtype: int32
- name: span2_index
dtype: int32
- name: span1_text
dtype: string
- name: span2_text
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 89311
num_examples: 554
- name: validation
num_bytes: 21521
num_examples: 104
- name: test
num_bytes: 31404
num_examples: 146
download_size: 32751
dataset_size: 142236
- config_name: axb
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
splits:
- name: test
num_bytes: 237694
num_examples: 1104
download_size: 33950
dataset_size: 237694
- config_name: axg
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: idx
dtype: int32
- name: label
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
splits:
- name: test
num_bytes: 53348
num_examples: 356
download_size: 10413
dataset_size: 53348
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1a3b5a28 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1330
dataset_size: 178
---
# Dataset Card for "1a3b5a28"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fireworker/Mark_list | ---
license: afl-3.0
---
|
Davoodii/Moein | ---
license: apache-2.0
---
|
bilgedogan/tts_audio_samples | ---
license: apache-2.0
dataset_info:
config_name: tts_models_multilingual_multi-dataset_xtts_v2
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: text
dtype: string
- name: time
dtype: float64
splits:
- name: train
num_bytes: 3088108.0
num_examples: 20
download_size: 1833355
dataset_size: 3088108.0
configs:
- config_name: tts_models_multilingual_multi-dataset_xtts_v2
data_files:
- split: train
path: tts_models_multilingual_multi-dataset_xtts_v2/train-*
---
|
CyberHarem/indigo_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of indigo/インディゴ/深靛 (Arknights)
This is the dataset of indigo/インディゴ/深靛 (Arknights), containing 152 images and their tags.
The core tags of this character are `long_hair, purple_eyes, very_long_hair, pointy_ears, breasts, brown_hair, blonde_hair, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 313.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indigo_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 152 | 257.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indigo_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 391 | 481.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indigo_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/indigo_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
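The `1200` and `stage3-p480-1200` packages are plain IMG+TXT archives: each image sits next to a same-stem `.txt` caption. A minimal sketch for pairing them after extraction (the comma-separated caption format is an assumption based on the tag lists in this card):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair every image in dataset_dir with its same-stem .txt tag file."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        tag_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(tag_path):
            continue  # image without a caption file; skip it
        with open(tag_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```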
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, black_gloves, solo, looking_at_viewer, infection_monitor_(arknights), white_background, simple_background, holding_staff, white_dress, small_breasts, thigh_strap, cowboy_shot, belt, closed_mouth, elbow_gloves, white_flower |
| 1 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, boots, full_body, holding_staff, infection_monitor_(arknights), looking_at_viewer, skirt, solo, black_socks, thigh_strap, simple_background, standing, tail, white_background, bare_legs, character_name, closed_mouth, elbow_gloves, flower, short_dress |
| 2 | 22 |  |  |  |  |  | 1girl, holding_fan, looking_at_viewer, solo, hair_ornament, official_alternate_costume, white_dress, folded_fan, long_sleeves, one_side_up, black_gloves, sitting, closed_mouth, china_dress, thigh_strap, black_footwear, full_body, hair_spread_out, lantern |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | solo | looking_at_viewer | infection_monitor_(arknights) | white_background | simple_background | holding_staff | white_dress | small_breasts | thigh_strap | cowboy_shot | belt | closed_mouth | elbow_gloves | white_flower | black_footwear | boots | full_body | skirt | black_socks | standing | tail | bare_legs | character_name | flower | short_dress | holding_fan | hair_ornament | official_alternate_costume | folded_fan | long_sleeves | one_side_up | sitting | china_dress | hair_spread_out | lantern |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:--------------------|:--------------------------------|:-------------------|:--------------------|:----------------|:--------------|:----------------|:--------------|:--------------|:-------|:---------------|:---------------|:---------------|:-----------------|:--------|:------------|:--------|:--------------|:-----------|:-------|:------------|:-----------------|:---------|:--------------|:--------------|:----------------|:-----------------------------|:-------------|:---------------|:--------------|:----------|:--------------|:------------------|:----------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | X | X | X | | | | | X | | X | | | X | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-96000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 661837
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nixiesearch/amazon-esci | ---
language:
- en
license: apache-2.0
tags:
- text
pretty_name: Amazon ESCI dataset in nixietune format
size_categories:
- "100K<n<1M"
source_datasets:
- Amazon ESCI
task_categories:
- sentence-similarity
dataset_info:
config_name: default
features:
- name: query
dtype: string
- name: doc
dtype: string
- name: neg
sequence: string
- name: negscore
sequence: float
splits:
- name: train
num_bytes: 2734101179
num_examples: 181819
- name: test
num_bytes: 1186871193
num_examples: 79708
- name: test_1k
num_bytes: 16656546
num_examples: 1000
train-eval-index:
- config: default
task: sentence-similarity
splits:
train_split: train
eval_split: test
configs:
- config_name: default
data_files:
- split: train
path: "data/train/*"
- split: test
path: "data/test/*"
- split: test_1k
path: "data/test_1k/*"
---
# Amazon ESCI dataset
A dataset in a [nixietune](https://github.com/nixiesearch/nixietune)-compatible format:
```json
{
  "query": "# cellist thats not a hashtag",
  "pos": "Funny Cellists That's Not A Hashtag Music Sweatshirt",
  "neg": [
    "Feelin Good Tees My Opinion Offended You Adult Humor T Shirt XL Black",
    "Christian Faith & Cross T-Shirt - Christian Faith T Shirts T-Shirt",
    "PLUS PLUS - 240 Piece Basic Mix - Construction Building Stem/Steam Toy, Mini Puzzle Blocks for Kids",
    "Caution I Learned to Drive Through Video Games - Funny Gamer T-Shirt",
    "People Who Tolerate Me On A Daily Basis T Shirt L Black"
  ]
}
```
This is an expanded version of the [Amazon ESCI small-en](https://github.com/amazon-science/esci-data) dataset that can be loaded directly with [HF datasets](https://huggingface.co/docs/datasets/index).
## Usage
```python
from datasets import load_dataset
data = load_dataset('nixiesearch/amazon-esci', split="train")
```
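To feed a contrastive trainer, each record can be flattened into labeled (query, passage) pairs. A sketch using the field names declared in this card's YAML header (`query`, `doc`, `neg`; note the handwritten example above calls the positive `pos`); the record literal is illustrative:

```python
def record_to_pairs(record):
    """Flatten one nixietune record into (query, passage, label) triples."""
    triples = [(record["query"], record["doc"], 1)]  # the positive passage
    triples += [(record["query"], n, 0) for n in record.get("neg", [])]
    return triples

example = {
    "query": "# cellist thats not a hashtag",
    "doc": "Funny Cellists That's Not A Hashtag Music Sweatshirt",
    "neg": ["People Who Tolerate Me On A Daily Basis T Shirt L Black"],
}
print(record_to_pairs(example))
```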
## License
Apache 2.0 |
Codec-SUPERB/crema_d_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 3571247159.242
num_examples: 7438
- name: academicodec_hifi_16k_320d
num_bytes: 1190833336.242
num_examples: 7438
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 1190833336.242
num_examples: 7438
- name: academicodec_hifi_24k_320d
num_bytes: 1785873336.242
num_examples: 7438
- name: audiodec_24k_320d
num_bytes: 1785873336.242
num_examples: 7438
- name: dac_16k
num_bytes: 1190833336.242
num_examples: 7438
- name: dac_24k
num_bytes: 1785873336.242
num_examples: 7438
- name: dac_44k
num_bytes: 3280912266.242
num_examples: 7438
- name: encodec_24k
num_bytes: 1785889142.242
num_examples: 7438
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 1190849142.242
num_examples: 7438
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 1190849142.242
num_examples: 7438
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 1190849142.242
num_examples: 7438
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 1190849142.242
num_examples: 7438
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 1190849142.242
num_examples: 7438
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 1190849142.242
num_examples: 7438
- name: speech_tokenizer_16k
num_bytes: 1195594586.242
num_examples: 7438
download_size: 17586097169
dataset_size: 25908858024.872005
---
# Dataset Card for "crema_d_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carsonpoole/llava-qa-high-ppl | ---
license: mit
---
|
eberreta/spanish-numbers | ---
license: mit
---
|
sanjay920/MetaMathQA-sharegpt | ---
dataset_info:
features:
- name: original_question
dtype: string
- name: type
dtype: string
- name: response
dtype: string
- name: query
dtype: string
- name: tools
dtype: string
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 853779213
num_examples: 395000
download_size: 368993237
dataset_size: 853779213
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/anchira_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of anchira/アンチラ (Granblue Fantasy)
This is the dataset of anchira/アンチラ (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `blonde_hair, animal_ears, short_hair, monkey_ears, monkey_tail, tail, breasts, small_breasts, red_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 698.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 394.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1244 | 869.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 620.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1244 | 1.21 GiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anchira_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
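When filtering the loaded items by tag, exact matching avoids substring surprises (e.g. `hair` matching `hairband`). A small helper, assuming tags are available as a comma-separated string like those shown in this card:

```python
def has_tag(tag_string, wanted):
    """True if a comma-separated tag string contains `wanted` as a whole tag."""
    return wanted in {t.strip() for t in tag_string.split(",")}

tags = "1girl, bare_shoulders, erune, looking_at_viewer, solo"
print(has_tag(tags, "solo"))  # whole-tag match
print(has_tag(tags, "hair"))  # substrings of a tag do not match
```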
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, erune, hair_flower, looking_at_viewer, official_alternate_costume, solo, blush, double_bun, smile, covered_navel, highleg_swimsuit, thighs, ahoge, casual_one-piece_swimsuit, detached_sleeves, see-through, blue_one-piece_swimsuit, open_mouth |
| 1 | 19 |  |  |  |  |  | 1girl, barefoot, detached_sleeves, erune, looking_at_viewer, solo, feet, bare_shoulders, soles, ass, blush, sideboob, toes, staff, cloud, smile |
| 2 | 6 |  |  |  |  |  | 1girl, cloud, detached_sleeves, erune, looking_at_viewer, sideboob, smile, solo, staff, bare_shoulders, brown_eyes, hagoromo |
| 3 | 7 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, erune, looking_at_viewer, solo, staff, barefoot, hairband, sideboob, blush, cloud, white_background |
| 4 | 9 |  |  |  |  |  | 1girl, bare_shoulders, blush, detached_sleeves, erune, solo, thighhighs, looking_at_viewer, simple_background, leotard, white_background, staff, wide_sleeves, brown_eyes, sideboob, thighs |
| 5 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cloud, detached_sleeves, erune, looking_at_viewer, solo, staff, two_side_up, barefoot, blush, cleavage_cutout, holding, sash, sideboob, hagoromo, hairband, simple_background, white_background, wide_sleeves, ahoge, chibi, leotard, sitting, smile |
| 6 | 6 |  |  |  |  |  | 1girl, ahoge, blue_shorts, blush, cleavage_cutout, erune, fur_trim, hairband, long_sleeves, looking_at_viewer, short_shorts, smile, solo, two_side_up, blue_ribbon, closed_mouth, petals, striped_thighhighs, denim_shorts, midriff, navel, sitting, white_shirt, bell, belt, full_body, hair_between_eyes, jacket, staff, tail_ribbon |
| 7 | 8 |  |  |  |  |  | detached_sleeves, harvin, wide_sleeves, bare_shoulders, looking_at_viewer, 1girl, bandeau, blush, feathers, hair_beads, very_long_hair, :o, black_thighhighs, pelvic_curtain, earrings, hands_on_own_face, midriff, multiple_girls, pointy_ears, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | erune | hair_flower | looking_at_viewer | official_alternate_costume | solo | blush | double_bun | smile | covered_navel | highleg_swimsuit | thighs | ahoge | casual_one-piece_swimsuit | detached_sleeves | see-through | blue_one-piece_swimsuit | open_mouth | barefoot | feet | soles | ass | sideboob | toes | staff | cloud | brown_eyes | hagoromo | hairband | white_background | thighhighs | simple_background | leotard | wide_sleeves | two_side_up | cleavage_cutout | holding | sash | chibi | sitting | blue_shorts | fur_trim | long_sleeves | short_shorts | blue_ribbon | closed_mouth | petals | striped_thighhighs | denim_shorts | midriff | navel | white_shirt | bell | belt | full_body | hair_between_eyes | jacket | tail_ribbon | harvin | bandeau | feathers | hair_beads | very_long_hair | :o | black_thighhighs | pelvic_curtain | earrings | hands_on_own_face | multiple_girls | pointy_ears | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------|:--------------------|:-----------------------------|:-------|:--------|:-------------|:--------|:----------------|:-------------------|:---------|:--------|:----------------------------|:-------------------|:--------------|:--------------------------|:-------------|:-----------|:-------|:--------|:------|:-----------|:-------|:--------|:--------|:-------------|:-----------|:-----------|:-------------------|:-------------|:--------------------|:----------|:---------------|:--------------|:------------------|:----------|:-------|:--------|:----------|:--------------|:-----------|:---------------|:---------------|:--------------|:---------------|:---------|:---------------------|:---------------|:----------|:--------|:--------------|:-------|:-------|:------------|:--------------------|:---------|:--------------|:---------|:----------|:-----------|:-------------|:-----------------|:-----|:-------------------|:-----------------|:-----------|:--------------------|:-----------------|:--------------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | | X | | | X | | | | | | X | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | X | | | | X | | | | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | X | | | X | | | | | | | | X | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | | | X | | X | | | | X | | | | X | | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | X | | X | X | | X | | | | X | | | | | | | | | | | | X | | | | X | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
male-2/training_v0.0.6-public_convert | ---
dataset_info:
features:
- name: id
dtype: string
- name: type
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: emotion
struct:
- name: joyful
dtype: bool
- name: sad
dtype: bool
- name: angry
dtype: bool
- name: example
dtype: string
splits:
- name: train
num_bytes: 1400
num_examples: 1
download_size: 11364
dataset_size: 1400
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tolgadev/ruyatabirleri | ---
license: apache-2.0
---
|
jp12zin/mcqueenmodelo | ---
license: openrail
---
|
Yijia-Xiao/samsum | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 554915
num_examples: 819
download_size: 0
dataset_size: 554915
---
# Dataset Card for "samsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeadPixels/DPhi_Sprint_25_Flowers | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': daisy
'1': dandelion
'2': rose
'3': sunflower
'4': tulip
splits:
- name: train
num_bytes: 123964921.405
num_examples: 2589
- name: test
num_bytes: 47588262
num_examples: 864
- name: validation
num_bytes: 47493769
num_examples: 864
download_size: 237386772
dataset_size: 219046952.405
license: cc-by-2.0
task_categories:
- image-classification
pretty_name: 'Data Sprint #25: Flower Recognition Dataset'
size_categories:
- 1K<n<10K
---
# Dataset Card for "DPhi_Sprint_25_Flowers"
All images in this archive are licensed under the Creative Commons Attribution License, available at:
https://creativecommons.org/licenses/by/2.0/
The photographers are listed in LICENSE.txt; thanks to all of them for making their work available.
Note that the image file names in this dataset differ from those in the original archive: the files were renamed solely for the purposes of the data sprint. |
Oysiyl/google-android-toy-sdxl | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2732919.0
num_examples: 15
download_size: 2734150
dataset_size: 2732919.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PROCESOS/INE_Reverso | ---
license: c-uda
---
|
tner/bionlp2004 | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: BioNLP2004
---
# Dataset Card for "tner/bionlp2004"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/U15-1010.pdf](https://aclanthology.org/U15-1010.pdf)
- **Dataset:** BioNLP2004
- **Domain:** Biochemical
- **Number of Entity:** 5
### Dataset Summary
The BioNLP2004 NER dataset, formatted as part of the [TNER](https://github.com/asahi417/tner) project.
The original BioNLP2004 release contains only training and test splits, so we randomly sample instances from the training set (half the size of the test set) to create a validation set.
- Entity Types: `DNA`, `protein`, `cell_type`, `cell_line`, `RNA`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
'tags': [0, 0, 0, 0, 3, 0, 9, 10, 0, 0, 0, 0, 0, 7, 8, 0, 3, 0, 0, 9, 10, 10, 0, 0],
'tokens': ['In', 'the', 'presence', 'of', 'Epo', ',', 'c-myb', 'mRNA', 'declined', 'and', '20', '%', 'of', 'K562', 'cells', 'synthesized', 'Hb', 'regardless', 'of', 'antisense', 'myb', 'RNA', 'expression', '.']
}
```
### Label ID
The label2id dictionary can be found [here](https://huggingface.co/datasets/tner/fin/raw/main/dataset/label.json).
```python
{
"O": 0,
"B-DNA": 1,
"I-DNA": 2,
"B-protein": 3,
"I-protein": 4,
"B-cell_type": 5,
"I-cell_type": 6,
"B-cell_line": 7,
"I-cell_line": 8,
"B-RNA": 9,
"I-RNA": 10
}
```
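As a minimal sketch of how an instance's integer `tags` relate to the mapping above, the IDs can be decoded back into BIO label strings by inverting `label2id` (the dictionary below is copied from the mapping shown above):

```python
# label2id mapping for tner/bionlp2004, as listed in the card above.
label2id = {
    "O": 0, "B-DNA": 1, "I-DNA": 2, "B-protein": 3, "I-protein": 4,
    "B-cell_type": 5, "I-cell_type": 6, "B-cell_line": 7, "I-cell_line": 8,
    "B-RNA": 9, "I-RNA": 10,
}
# Invert the mapping so integer tag IDs can be looked up by value.
id2label = {idx: label for label, idx in label2id.items()}

def decode_tags(tags):
    """Map a sequence of integer tag IDs to their BIO label strings."""
    return [id2label[t] for t in tags]

# The `tags` field from the train example shown above.
example_tags = [0, 0, 0, 0, 3, 0, 9, 10, 0, 0, 0, 0,
                0, 7, 8, 0, 3, 0, 0, 9, 10, 10, 0, 0]
print(decode_tags(example_tags)[4:8])
# -> ['B-protein', 'O', 'B-RNA', 'I-RNA']  (labels for 'Epo', ',', 'c-myb', 'mRNA')
```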
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|bionlp2004 |16619 | 1927| 3856|
### Citation Information
```
@inproceedings{collier-kim-2004-introduction,
title = "Introduction to the Bio-entity Recognition Task at {JNLPBA}",
author = "Collier, Nigel and
Kim, Jin-Dong",
booktitle = "Proceedings of the International Joint Workshop on Natural Language Processing in Biomedicine and its Applications ({NLPBA}/{B}io{NLP})",
month = aug # " 28th and 29th",
year = "2004",
address = "Geneva, Switzerland",
publisher = "COLING",
url = "https://aclanthology.org/W04-1213",
pages = "73--78",
}
``` |
jondewoo/analytical-cubism | ---
license: cc0-1.0
---
|
Venkatesh26/salesforce-flow2-xml | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2213
num_examples: 1
download_size: 15064
dataset_size: 2213
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdityaAI9/Resume_parser | ---
license: apache-2.0
dataset_info:
features:
- name: Category
dtype: int64
- name: Feature
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 24161822
num_examples: 2483
download_size: 6908833
dataset_size: 24161822
---
|
bdsaglam/musique-jerx-rltf | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: text
dtype: string
- name: answers
sequence: string
splits:
- name: train
num_bytes: 16645704
num_examples: 14376
- name: validation
num_bytes: 1480136
num_examples: 1252
download_size: 10897332
dataset_size: 18125840
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
danielz01/rsicd | ---
dataset_info:
features:
- name: image
dtype: image
- name: path
dtype: string
- name: img_id
dtype: int64
- name: land_cover
dtype: string
- name: captions
sequence: string
- name: caption_ids
sequence: int64
splits:
- name: train
num_bytes: 460526263.218
num_examples: 8734
- name: val
num_bytes: 60256331.322
num_examples: 1094
download_size: 473790206
dataset_size: 520782594.53999996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
# Dataset Card for "rsicd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
surajp/shrutilipi_sanskrit | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcriptions
dtype: string
splits:
- name: train
num_bytes: 7961235781.888
num_examples: 14414
download_size: 7320639953
dataset_size: 7961235781.888
---
# Dataset Card for "shrutilipi_sanskrit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Clonadordoely/marcos | ---
license: openrail
---
|
open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped | ---
pretty_name: Evaluation run of EleutherAI/pythia-410m-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-410m-deduped](https://huggingface.co/EleutherAI/pythia-410m-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T09:27:36.064128](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped/blob/main/results_2023-10-18T09-27-36.064128.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298293,\n \"f1\": 0.042572357382550524,\n\
\ \"f1_stderr\": 0.0011637772390608397,\n \"acc\": 0.27341843124559817,\n\
\ \"acc_stderr\": 0.007756513586074438\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298293,\n\
\ \"f1\": 0.042572357382550524,\n \"f1_stderr\": 0.0011637772390608397\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245436\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.013998453610924331\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-410m-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T09_27_36.064128
path:
- '**/details_harness|drop|3_2023-10-18T09-27-36.064128.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T09-27-36.064128.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T09_27_36.064128
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-27-36.064128.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-27-36.064128.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:23:02.980263.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:23:02.980263.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T09_27_36.064128
path:
- '**/details_harness|winogrande|5_2023-10-18T09-27-36.064128.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T09-27-36.064128.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_23_02.980263
path:
- results_2023-07-19T14:23:02.980263.parquet
- split: 2023_10_18T09_27_36.064128
path:
- results_2023-10-18T09-27-36.064128.parquet
- split: latest
path:
- results_2023-10-18T09-27-36.064128.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-410m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-410m-deduped](https://huggingface.co/EleutherAI/pythia-410m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T09:27:36.064128](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped/blob/main/results_2023-10-18T09-27-36.064128.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298293,
"f1": 0.042572357382550524,
"f1_stderr": 0.0011637772390608397,
"acc": 0.27341843124559817,
"acc_stderr": 0.007756513586074438
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298293,
"f1": 0.042572357382550524,
"f1_stderr": 0.0011637772390608397
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245436
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.013998453610924331
}
}
```
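As a quick sanity check, the top-level `"all"` block is the mean of the per-task metrics; a minimal sketch (with the accuracy values copied from the JSON above) reproduces it:

```python
# Sketch: the "all" accuracy above is the mean of the per-task accuracies.
# Values are copied from the latest-results JSON shown above.
results = {
    "harness|gsm8k|5": {"acc": 0.003032600454890068},
    "harness|winogrande|5": {"acc": 0.5438042620363063},
}
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.8f}")
```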
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davanstrien/haiku-topics | ---
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 13508
num_examples: 644
download_size: 9770
dataset_size: 13508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/CodeAlpaca-20k_list_dict | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 6372564
num_examples: 20021
download_size: 0
dataset_size: 6372564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CodeAlpaca-20k_list_dict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zzh12138/CRT-QA | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
size_categories:
- n<1K
---
This repository contains the CRT-QA dataset, which includes question-answer pairs that require complex reasoning over tabular data. 🚀
## About the Dataset and Paper
- **Title:** CRT-QA: A Dataset of Complex Reasoning Question Answering over Tabular Data
- **Conference:** EMNLP 2023
- **Authors:** [Zhehao Zhang](https://zzh-sjtu.github.io/zhehaozhang.github.io/), Xitao Li, Yan Gao, Jian-Guang Lou 👩💼👨💼
- **Affiliation:** Dartmouth College, Xi'an Jiaotong University, Microsoft Research Asia 🏢
### Data Format
The data is stored in a json file, structured with the following fields for each datapoint (keyed by a .csv file table):
```
Question name, Title, step1, step2, step3, step4, Answer, Directness, Composition Type
```
- `Question name`: The text of the question
- `Title`: The title of the table that the question refers to
- `step1` to `step4`: Steps describing the reasoning process and operations used to answer the question
- `type`: `Operation` or `Reasoning`
- `name`: Name of the specific operation or reasoning type
- `detail`: Additional details about the step
- `Answer`: The answer text
- `Directness`: `Explicit` or `Implicit` question
- `Composition Type`: `Bridging`, `Intersection`, or `Comparison`
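For illustration, a datapoint following the schema above might be parsed like this (every field value below is an invented placeholder for the sketch, not real dataset content):

```python
import json

# Hypothetical CRT-QA-style datapoint, keyed by its .csv table file.
# All values are illustrative placeholders, not taken from the actual dataset.
raw = json.dumps({
    "table_0001.csv": {
        "Question name": "Which country hosted the most events?",
        "Title": "International events by host country",
        "step1": {"type": "Operation", "name": "Grouping",
                  "detail": "group rows by host country"},
        "step2": {"type": "Reasoning", "name": "Aggregating",
                  "detail": "count events per group"},
        "Answer": "France",
        "Directness": "Explicit",
        "Composition Type": "Bridging",
    }
})

data = json.loads(raw)
for table_file, qa in data.items():
    # Steps beyond step1 are optional; collect whichever are present, in order.
    steps = [qa[f"step{i}"] for i in range(1, 5) if f"step{i}" in qa]
    step_names = [s["name"] for s in steps]
    print(table_file, qa["Answer"], step_names)
```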
### Reasoning and Operations
The reasoning and operations referenced in the `step` fields come from a defined taxonomy:
**Operations:**
- Indexing
- Filtering
- Grouping
- Sorting
**Reasoning:**
- Grounding
- Auto-categorization
- Temporal Reasoning
- Geographical/Spatial Reasoning
- Aggregating
- Arithmetic
- Reasoning with Quantifiers
- Other Commonsense Reasoning
### Contact 📧
For inquiries or updates about this repository, please contact [zhehao.zhang.gr@dartmouth.edu]. 📬
### Citation
If you use this dataset in your research, please cite the following paper:
```
@inproceedings{zhang-etal-2023-crt,
title = "{CRT}-{QA}: A Dataset of Complex Reasoning Question Answering over Tabular Data",
author = "Zhang, Zhehao and
Li, Xitao and
Gao, Yan and
Lou, Jian-Guang",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.132",
doi = "10.18653/v1/2023.emnlp-main.132",
pages = "2131--2153",
abstract = "Large language models (LLMs) show powerful reasoning abilities on various text-based tasks. However, their reasoning capability on structured data such as tables has not been systematically explored. In this work, we first establish a comprehensive taxonomy of reasoning and operation types for tabular data analysis. Then, we construct a complex reasoning QA dataset over tabular data, named CRT-QA dataset (Complex Reasoning QA over Tabular data), with the following unique features: (1) it is the first Table QA dataset with multi-step operation and informal reasoning; (2) it contains fine-grained annotations on questions{'} directness, composition types of sub-questions, and human reasoning paths which can be used to conduct a thorough investigation on LLMs{'} reasoning ability; (3) it contains a collection of unanswerable and indeterminate questions that commonly arise in real-world situations. We further introduce an efficient and effective tool-augmented method, named ARC (Auto-exemplar-guided Reasoning with Code), to use external tools such as Pandas to solve table reasoning tasks without handcrafted demonstrations. The experiment results show that CRT-QA presents a strong challenge for baseline methods and ARC achieves the best result.",
}
``` |
vwxyzjn/openhermes-dev__combined__1708359376 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate_policies
sequence: string
splits:
- name: train
num_bytes: 1179397
num_examples: 200
download_size: 475933
dataset_size: 1179397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shreevigneshs/iwslt-2023-en-vi-train-split | ---
license: gpl-3.0
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
- name: vi_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: train
num_bytes: 291476.0
num_examples: 640
- name: val
num_bytes: 71743.0
num_examples: 160
- name: if_test
num_bytes: 34426.0
num_examples: 80
- name: f_test
num_bytes: 37317.0
num_examples: 80
download_size: 215786
dataset_size: 434962.0
---
|
Jzuluaga/atco2_corpus_1h | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: segment_start_time
dtype: float32
- name: segment_end_time
dtype: float32
- name: duration
dtype: float32
splits:
- name: test
num_bytes: 113872168.0
num_examples: 871
download_size: 113467762
dataset_size: 113872168.0
tags:
- audio
- automatic-speech-recognition
- en-atc
- en
- noisy-speech-recognition
- speech-recognition
task_categories:
- automatic-speech-recognition
language:
- en
multilinguality:
- monolingual
---
# Dataset Card for ATCO2 test set corpus (1hr set)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages and Other Details](#languages-and-other-details)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [ATCO2 project homepage](https://www.atco2.org/)
- **Repository:** [ATCO2 corpus](https://github.com/idiap/atco2-corpus)
- **Paper:** [ATCO2 corpus: A Large-Scale Dataset for Research on Automatic Speech Recognition and Natural Language Understanding of Air Traffic Control Communications](https://arxiv.org/abs/2211.04054)
### Dataset Summary
The ATCO2 project aims at developing a unique platform to collect, organize and pre-process air-traffic control (voice communication) data from air space. This project has received funding from the Clean Sky 2 Joint Undertaking (JU) under grant agreement No 864702. The JU receives support from the European Union’s Horizon 2020 research and innovation programme and the Clean Sky 2 JU members other than the Union.
The project collected the real-time voice communication between air-traffic controllers and pilots available either directly through publicly accessible radio frequency channels or indirectly from air-navigation service providers (ANSPs). In addition to the voice communication data, contextual information is available in a form of metadata (i.e. surveillance data). The dataset consists of two distinct packages:
- A corpus of 5000+ hours (pseudo-transcribed) of air-traffic control speech collected across different airports (Sion, Bern, Zurich, etc.) in .wav format for speech recognition. Speaker distribution is 90/10% between males and females and the group contains native and non-native speakers of English.
- A corpus of 4 hours (transcribed) of air-traffic control speech collected across different airports (Sion, Bern, Zurich, etc.) in .wav format for speech recognition. Speaker distribution is 90/10% between males and females and the group contains native and non-native speakers of English. This corpus has been transcribed with orthographic information in XML format with speaker noise information, SNR values and others.
- A free sample of the 4 hours of transcribed data is available on the [ATCO2 project homepage](https://www.atco2.org/data)
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`. Already adapted/fine-tuned models are available here --> [Wav2Vec 2.0 LARGE model](https://huggingface.co/Jzuluaga/wav2vec2-large-960h-lv60-self-en-atc-uwb-atcc-and-atcosim).
### Languages and other details
The text and the recordings are in English. For more information see Table 3 and Table 4 of [ATCO2 corpus paper](https://arxiv.org/abs/2211.04054)
## Dataset Structure
### Data Fields
- `id (string)`: a unique recording identifier string for each example.
- `audio (audio)`: audio data for the given ID
- `text (string)`: transcript of the file already normalized. Follow these repositories for more details [w2v2-air-traffic](https://github.com/idiap/w2v2-air-traffic) and [bert-text-diarization-atc](https://github.com/idiap/bert-text-diarization-atc)
- `segment_start_time (float32)`: segment start time (normally 0)
- `segment_end_time (float32)`: segment end time
- `duration (float32)`: duration of the recording, computed as `segment_end_time - segment_start_time`
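Put together, a single record might look like the following sketch (the `id` and `text` values are made-up placeholders in the dataset's style, not real entries):

```python
# Illustrative ATCO2-style record; every value here is a placeholder.
record = {
    "id": "LKPR_RUZYNE_Tower_120_520MHz_20201025_093936",  # hypothetical id
    "text": "swiss four six nine contact apron one two one decimal eight five",
    "segment_start_time": 0.0,
    "segment_end_time": 5.04,
}
# duration is defined as segment_end_time - segment_start_time
record["duration"] = record["segment_end_time"] - record["segment_start_time"]
print(record["duration"])
```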
## Additional Information
### Licensing Information
The licensing status of the ATCO2-test-set-1h corpus is in the file **ATCO2-ASRdataset-v1_beta - End-User Data Agreement** in the data folder. Download the data from the [ATCO2 project homepage](https://www.atco2.org/data)
### Citation Information
Contributors who prepared, processed, normalized and uploaded the dataset in HuggingFace:
```
@article{zuluaga2022how,
title={How Does Pre-trained Wav2Vec2. 0 Perform on Domain Shifted ASR? An Extensive Benchmark on Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Prasad, Amrutha and Nigmatulina, Iuliia and Sarfjoo, Saeed and others},
journal={IEEE Spoken Language Technology Workshop (SLT), Doha, Qatar},
year={2022}
}
@article{zuluaga2022bertraffic,
title={BERTraffic: BERT-based Joint Speaker Role and Speaker Change Detection for Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Sarfjoo, Seyyed Saeed and Prasad, Amrutha and others},
journal={IEEE Spoken Language Technology Workshop (SLT), Doha, Qatar},
year={2022}
}
@article{zuluaga2022atco2,
title={ATCO2 corpus: A Large-Scale Dataset for Research on Automatic Speech Recognition and Natural Language Understanding of Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Vesel{\`y}, Karel and Sz{\"o}ke, Igor and Motlicek, Petr and others},
journal={arXiv preprint arXiv:2211.04054},
year={2022}
}
```
|
pharaouk/biology_dataset_standardized_cluster_11 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ademax/binhvq-news-sentence | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26415267562
num_examples: 141229987
download_size: 13799896934
dataset_size: 26415267562
---
# Dataset Card for "binhvq-news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BAAI/JudgeLM-data-collection-v1.0 |
---
license: cc-by-nc-4.0
language:
- en
tags:
- instruction-finetuning
pretty_name: JudgeLM-data-collection-v1.0
task_categories:
- text-generation
---
# Dataset Card for JudgeLM-data-collection
## Dataset Description
- **Repository:** https://github.com/baaivision/JudgeLM
- **Paper:** https://arxiv.org/abs/2310.17631
### Dataset Summary
This dataset is created for easy use and evaluation of JudgeLM. We include LLM-generated answers and a great multi-modal benchmark, [MM-Vet](https://github.com/yuweihao/MM-Vet), in this repo. The folder structure is shown below:
**Folder structure**
```
data
├── JudgeLM/
│ ├── answers/
│ │ ├── alpaca_judgelm_val.jsonl
| | ├── ...
│ ├── judgelm_preprocess.py
│ ├── judgelm_val_5k.jsonl
│ ├── judgelm_val_5k_gpt4.jsonl
│ ├── judgelm_val_5k_gpt4_with_reference.jsonl
│ ├── judgelm_val_5k_references.jsonl
├── MM-Vet/
│ ├── mm-vet-emu-prediction.json
│ ├── mm-vet-gt.json
│ ├── mm-vet-judge-samples.jsonl
│ ├── mmvet_preprocess.py
```
You can put this dataset collection directly into `/JudgeLM/judgelm` for easier use.
### Languages
The data are in English.
## Additional Information
### Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
### Citation Information
```
@article{zhu2023judgelm,
title={JudgeLM: Fine-tuned Large Language Models are Scalable Judges},
author={Lianghui Zhu and Xinggang Wang and Xinlong Wang},
year={2023},
eprint={2310.17631},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
pphuc25/vlsp-test-vectorized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 25301722096
num_examples: 26343
download_size: 966087887
dataset_size: 25301722096
---
# Dataset Card for "vlsp-eval-vectorized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.1_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43551536
num_examples: 18929
- name: epoch_1
num_bytes: 44054344
num_examples: 18929
- name: epoch_2
num_bytes: 44126968
num_examples: 18929
- name: epoch_3
num_bytes: 44156009
num_examples: 18929
- name: epoch_4
num_bytes: 44169091
num_examples: 18929
- name: epoch_5
num_bytes: 44180734
num_examples: 18929
- name: epoch_6
num_bytes: 44185620
num_examples: 18929
- name: epoch_7
num_bytes: 44190904
num_examples: 18929
- name: epoch_8
num_bytes: 44191857
num_examples: 18929
- name: epoch_9
num_bytes: 44195738
num_examples: 18929
- name: epoch_10
num_bytes: 44196812
num_examples: 18929
- name: epoch_11
num_bytes: 44196809
num_examples: 18929
- name: epoch_12
num_bytes: 44194802
num_examples: 18929
- name: epoch_13
num_bytes: 44197919
num_examples: 18929
- name: epoch_14
num_bytes: 44196470
num_examples: 18929
- name: epoch_15
num_bytes: 44196870
num_examples: 18929
- name: epoch_16
num_bytes: 44197598
num_examples: 18929
- name: epoch_17
num_bytes: 44198309
num_examples: 18929
- name: epoch_18
num_bytes: 44197247
num_examples: 18929
- name: epoch_19
num_bytes: 44197080
num_examples: 18929
- name: epoch_20
num_bytes: 44198077
num_examples: 18929
- name: epoch_21
num_bytes: 44197665
num_examples: 18929
- name: epoch_22
num_bytes: 44199448
num_examples: 18929
- name: epoch_23
num_bytes: 44199139
num_examples: 18929
- name: epoch_24
num_bytes: 44199573
num_examples: 18929
- name: epoch_25
num_bytes: 44197740
num_examples: 18929
- name: epoch_26
num_bytes: 44199526
num_examples: 18929
- name: epoch_27
num_bytes: 44199444
num_examples: 18929
- name: epoch_28
num_bytes: 44199135
num_examples: 18929
- name: epoch_29
num_bytes: 44199279
num_examples: 18929
download_size: 1606494145
dataset_size: 1324961743
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
deepghs/anime_ch_horn | ---
license: mit
task_categories:
- image-classification
tags:
- art
size_categories:
- 10K<n<100K
--- |
datahrvoje/twitter_dataset_1713159278 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22066
num_examples: 51
download_size: 11511
dataset_size: 22066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YassineBenlaria/tq_ar_fr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tq
dtype: string
- name: ar
dtype: string
- name: fr
dtype: string
splits:
- name: train
num_bytes: 3799473
num_examples: 5467
- name: test
num_bytes: 433414
num_examples: 804
download_size: 2361408
dataset_size: 4232887
---
# Dataset Card for "tamasheq_arabic_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_FelixChao__Severus-7B | ---
pretty_name: Evaluation run of FelixChao/Severus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/Severus-7B](https://huggingface.co/FelixChao/Severus-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Severus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T10:08:30.529941](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Severus-7B/blob/main/results_2024-01-21T10-08-30.529941.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564105927142331,\n\
\ \"acc_stderr\": 0.032011062761424514,\n \"acc_norm\": 0.6561944704621784,\n\
\ \"acc_norm_stderr\": 0.03267453714628516,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6136319274258737,\n\
\ \"mc2_stderr\": 0.015253515428580656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156217,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.686516630153356,\n\
\ \"acc_stderr\": 0.004629608863272306,\n \"acc_norm\": 0.8688508265285799,\n\
\ \"acc_norm_stderr\": 0.00336873543416138\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700472,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700472\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773496,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773496\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6136319274258737,\n\
\ \"mc2_stderr\": 0.015253515428580656\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510423\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7270659590598939,\n \
\ \"acc_stderr\": 0.012270381151108754\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/Severus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|arc:challenge|25_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|gsm8k|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hellaswag|10_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T10-08-30.529941.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- '**/details_harness|winogrande|5_2024-01-21T10-08-30.529941.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T10-08-30.529941.parquet'
- config_name: results
data_files:
- split: 2024_01_21T10_08_30.529941
path:
- results_2024-01-21T10-08-30.529941.parquet
- split: latest
path:
- results_2024-01-21T10-08-30.529941.parquet
---
# Dataset Card for Evaluation run of FelixChao/Severus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Severus-7B](https://huggingface.co/FelixChao/Severus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Severus-7B",
"harness_winogrande_5",
split="train")
```
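The per-run split names in the YAML above (e.g. `2024_01_21T10_08_30.529941`) appear to be the run's ISO timestamp with `-` and `:` replaced by `_`. A small sketch of that mapping (the helper name is hypothetical, not part of this repo):

```python
# Hypothetical helper: derive the split name used in this dataset's configs
# from a run timestamp. The convention observed in the YAML above is the ISO
# timestamp with "-" and ":" replaced by "_".
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

split_name = timestamp_to_split("2024-01-21T10:08:30.529941")
print(split_name)  # → 2024_01_21T10_08_30.529941
# e.g. load_dataset(..., split=split_name) to pin a specific run
```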
## Latest results
These are the [latest results from run 2024-01-21T10:08:30.529941](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Severus-7B/blob/main/results_2024-01-21T10-08-30.529941.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564105927142331,
"acc_stderr": 0.032011062761424514,
"acc_norm": 0.6561944704621784,
"acc_norm_stderr": 0.03267453714628516,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538765,
"mc2": 0.6136319274258737,
"mc2_stderr": 0.015253515428580656
},
"harness|arc:challenge|25": {
"acc": 0.6561433447098977,
"acc_stderr": 0.013880644570156217,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.686516630153356,
"acc_stderr": 0.004629608863272306,
"acc_norm": 0.8688508265285799,
"acc_norm_stderr": 0.00336873543416138
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700472,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773496,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773496
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.01860755213127983,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.01860755213127983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538765,
"mc2": 0.6136319274258737,
"mc2_stderr": 0.015253515428580656
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510423
},
"harness|gsm8k|5": {
"acc": 0.7270659590598939,
"acc_stderr": 0.012270381151108754
}
}
```
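The per-task entries above can be aggregated client-side, for instance to compute a macro-average over the MMLU ("hendrycksTest") subtasks. A minimal sketch, using a small subset of the values from the JSON above for illustration:

```python
# Sketch (not part of the card): macro-average accuracy over the MMLU
# ("hendrycksTest") subtasks of a results dict shaped like the JSON above.
# Only three subtasks are included here, copied from the run for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

# Select only the MMLU subtasks by their key prefix, then average their acc.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))  # → 0.5554
```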
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yzhuang/autotree_snxor_l1_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 155000000
num_examples: 10000
- name: validation
num_bytes: 155000000
num_examples: 10000
- name: test
num_bytes: 155000000
num_examples: 10000
download_size: 266245267
dataset_size: 465000000
---
# Dataset Card for "autotree_snxor_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
danjacobellis/audio_har | ---
dataset_info:
features:
- name: path
dtype: audio
- name: label
dtype: int64
- name: label_str
dtype: string
- name: participant
dtype: int64
splits:
- name: semi_natural
num_bytes: 1998810259.0
num_examples: 669
download_size: 1096427152
dataset_size: 1998810259.0
configs:
- config_name: default
data_files:
- split: semi_natural
path: data/semi_natural-*
---
|
OALL/Arabic_EXAMS | ---
dataset_info:
features:
- name: id
dtype: string
- name: subject
dtype: string
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 167521
num_examples: 537
- name: validation
num_bytes: 7803
num_examples: 25
download_size: 99036
dataset_size: 175324
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
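The `Arabic_EXAMS` schema above is a standard four-option multiple-choice layout (a question, options `A`–`D`, and a letter `answer`). A minimal sketch, with a hypothetical example row rather than an actual dataset entry, of formatting one such row into an evaluation prompt and scoring a predicted letter:

```python
# Format a multiple-choice row (question + options A-D) into a prompt
# and score a predicted answer letter. The row below is hypothetical.

def format_prompt(row):
    options = "\n".join(f"{letter}. {row[letter]}" for letter in "ABCD")
    return f"{row['question']}\n{options}\nAnswer:"

def is_correct(row, predicted_letter):
    return predicted_letter.strip().upper() == row["answer"]

row = {  # hypothetical example row
    "id": "q1",
    "subject": "history",
    "question": "Which city is the capital of Egypt?",
    "A": "Cairo", "B": "Alexandria", "C": "Giza", "D": "Luxor",
    "answer": "A",
}
print(format_prompt(row))
print(is_correct(row, "a"))  # letter comparison is case-insensitive
```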
open-llm-leaderboard/details_Test157t__Prima-LelantaclesV6-7b | ---
pretty_name: Evaluation run of Test157t/Prima-LelantaclesV6-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Prima-LelantaclesV6-7b](https://huggingface.co/Test157t/Prima-LelantaclesV6-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Prima-LelantaclesV6-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T05:25:46.353716](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-LelantaclesV6-7b/blob/main/results_2024-03-01T05-25-46.353716.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6517987616925843,\n\
\ \"acc_stderr\": 0.032081956024637245,\n \"acc_norm\": 0.6514354489793407,\n\
\ \"acc_norm_stderr\": 0.03275253503008902,\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.642907732226627,\n\
\ \"mc2_stderr\": 0.015485548517429995\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623494,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7057359091814379,\n\
\ \"acc_stderr\": 0.004547798964126664,\n \"acc_norm\": 0.8765186217884884,\n\
\ \"acc_norm_stderr\": 0.003283165867631365\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049395,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049395\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.642907732226627,\n\
\ \"mc2_stderr\": 0.015485548517429995\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571752\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Prima-LelantaclesV6-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|arc:challenge|25_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|gsm8k|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hellaswag|10_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-25-46.353716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T05-25-46.353716.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- '**/details_harness|winogrande|5_2024-03-01T05-25-46.353716.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T05-25-46.353716.parquet'
- config_name: results
data_files:
- split: 2024_03_01T05_25_46.353716
path:
- results_2024-03-01T05-25-46.353716.parquet
- split: latest
path:
- results_2024-03-01T05-25-46.353716.parquet
---
# Dataset Card for Evaluation run of Test157t/Prima-LelantaclesV6-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Prima-LelantaclesV6-7b](https://huggingface.co/Test157t/Prima-LelantaclesV6-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Prima-LelantaclesV6-7b",
"harness_winogrande_5",
	split="latest")
```
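Note that the timestamped split names in this card replace the `-` and `:` characters of the run timestamp with underscores (compare the run `2024-03-01T05:25:46.353716` with the split `2024_03_01T05_25_46.353716` in the YAML above). A minimal helper illustrating that naming convention, assuming it holds across runs (this is an observation about this card, not an official `datasets` API):

```python
# Derive the per-run split name from a run timestamp. The timestamped
# splits in this card replace "-" and ":" with "_" while keeping the
# microsecond suffix unchanged.
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-01T05:25:46.353716"))
# 2024_03_01T05_25_46.353716
```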
## Latest results
These are the [latest results from run 2024-03-01T05:25:46.353716](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-LelantaclesV6-7b/blob/main/results_2024-03-01T05-25-46.353716.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6517987616925843,
"acc_stderr": 0.032081956024637245,
"acc_norm": 0.6514354489793407,
"acc_norm_stderr": 0.03275253503008902,
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.642907732226627,
"mc2_stderr": 0.015485548517429995
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623494,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.7057359091814379,
"acc_stderr": 0.004547798964126664,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.003283165867631365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049395,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049395
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.01625113971157077,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.01625113971157077
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.642907732226627,
"mc2_stderr": 0.015485548517429995
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571752
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768111
}
}
```
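Each per-task entry above follows the same shape: an `acc`/`acc_norm` pair with standard errors (or `mc1`/`mc2` for TruthfulQA). As a sketch of how to work with the JSON once loaded into a dict, the per-subject MMLU accuracies can be filtered out by the `hendrycksTest` prefix in the task key (the two entries below are copied from the results above; the full dict has one entry per task):

```python
# Extract per-subject accuracies for the MMLU ("hendrycksTest") tasks
# from a results dict shaped like the JSON above. Task keys look like
# "harness|hendrycksTest-<subject>|<n_shots>".
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|winogrande|5": {"acc": 0.8484609313338595},
}

mmlu_acc = {
    task.split("|")[1].removeprefix("hendrycksTest-"): scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
print(mmlu_acc)
# {'abstract_algebra': 0.34, 'anatomy': 0.6444444444444445}
```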
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]