| datasetId | card |
|---|---|
akoukas/autextification | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': generated
'1': human
splits:
- name: train
num_bytes: 10758176
num_examples: 33845
download_size: 6321075
dataset_size: 10758176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
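The card above is metadata only; a minimal sketch of loading it with the `datasets` library, assuming the `akoukas/autextification` repo is accessible and matches the declared features:

```python
from datasets import load_dataset

# Load the single "train" split declared in the card's metadata.
ds = load_dataset("akoukas/autextification", split="train")

# "label" is a class_label feature: 0 -> "generated", 1 -> "human".
label_names = ds.features["label"].names
example = ds[0]
print(example["text"][:80], "->", label_names[example["label"]])
```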
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k](https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T05:49:52.384662](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k/blob/main/results_2024-01-08T05-49-52.384662.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4353450707475209,\n\
\ \"acc_stderr\": 0.03461671516845463,\n \"acc_norm\": 0.43568385204742693,\n\
\ \"acc_norm_stderr\": 0.035330423938582683,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.44491612521505014,\n\
\ \"mc2_stderr\": 0.014935356559440623\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39334470989761094,\n \"acc_stderr\": 0.014275101465693024,\n\
\ \"acc_norm\": 0.45051194539249145,\n \"acc_norm_stderr\": 0.014539646098471627\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45717984465245964,\n\
\ \"acc_stderr\": 0.004971449552787173,\n \"acc_norm\": 0.6079466241784505,\n\
\ \"acc_norm_stderr\": 0.0048721072620824726\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779204,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779204\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.0398124054371786,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.0398124054371786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44193548387096776,\n\
\ \"acc_stderr\": 0.02825155790684974,\n \"acc_norm\": 0.44193548387096776,\n\
\ \"acc_norm_stderr\": 0.02825155790684974\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.42487046632124353,\n \"acc_stderr\": 0.0356747133521254,\n\
\ \"acc_norm\": 0.42487046632124353,\n \"acc_norm_stderr\": 0.0356747133521254\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633506,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5100917431192661,\n \"acc_stderr\": 0.02143295620345332,\n \"\
acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.02143295620345332\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.0346022832723917,\n \"acc_norm\"\
: 0.4166666666666667,\n \"acc_norm_stderr\": 0.0346022832723917\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.5316455696202531,\n \"acc_stderr\": 0.03248197400511075,\n \"\
acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.03248197400511075\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.03011821010694265,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.03011821010694265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4904214559386973,\n\
\ \"acc_stderr\": 0.017876682275340845,\n \"acc_norm\": 0.4904214559386973,\n\
\ \"acc_norm_stderr\": 0.017876682275340845\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095273,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095273\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4533762057877814,\n\
\ \"acc_stderr\": 0.02827435985489426,\n \"acc_norm\": 0.4533762057877814,\n\
\ \"acc_norm_stderr\": 0.02827435985489426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.027684721415656203,\n\
\ \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.027684721415656203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n\
\ \"acc_stderr\": 0.012032022332260512,\n \"acc_norm\": 0.3324641460234681,\n\
\ \"acc_norm_stderr\": 0.012032022332260512\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35947712418300654,\n \"acc_stderr\": 0.019412539242032165,\n \
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.019412539242032165\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5621890547263682,\n\
\ \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.5621890547263682,\n\
\ \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4853801169590643,\n \"acc_stderr\": 0.038331852752130205,\n\
\ \"acc_norm\": 0.4853801169590643,\n \"acc_norm_stderr\": 0.038331852752130205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.44491612521505014,\n\
\ \"mc2_stderr\": 0.014935356559440623\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6219415943172849,\n \"acc_stderr\": 0.013628165460523239\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \
\ \"acc_stderr\": 0.013661649780905493\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-19-01.672663.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-49-52.384662.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- '**/details_harness|winogrande|5_2024-01-08T02-19-01.672663.parquet'
- split: 2024_01_08T05_49_52.384662
path:
- '**/details_harness|winogrande|5_2024-01-08T05-49-52.384662.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T05-49-52.384662.parquet'
- config_name: results
data_files:
- split: 2024_01_08T02_19_01.672663
path:
- results_2024-01-08T02-19-01.672663.parquet
- split: 2024_01_08T05_49_52.384662
path:
- results_2024-01-08T05-49-52.384662.parquet
- split: latest
path:
- results_2024-01-08T05-49-52.384662.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k](https://huggingface.co/OpenBuddy/openbuddy-deepseekcoder-33b-v16.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k",
"harness_winogrande_5",
split="train")
```
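For instance, to read the aggregated metrics or to pin a specific run rather than the latest one, you can target the `results` config and the timestamped splits listed in the header above (a minimal sketch; split names are the run timestamps with `-` and `:` replaced by `_`):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k"

# Aggregated metrics for the newest run; "latest" mirrors the most recent
# timestamped split.
results = load_dataset(REPO, "results", split="latest")

# Per-example details for one MMLU subtask, pinned to the 2024-01-08T05:49:52 run.
prehistory = load_dataset(
    REPO,
    "harness_hendrycksTest_prehistory_5",
    split="2024_01_08T05_49_52.384662",
)
```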
## Latest results
These are the [latest results from run 2024-01-08T05:49:52.384662](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k/blob/main/results_2024-01-08T05-49-52.384662.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4353450707475209,
"acc_stderr": 0.03461671516845463,
"acc_norm": 0.43568385204742693,
"acc_norm_stderr": 0.035330423938582683,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.44491612521505014,
"mc2_stderr": 0.014935356559440623
},
"harness|arc:challenge|25": {
"acc": 0.39334470989761094,
"acc_stderr": 0.014275101465693024,
"acc_norm": 0.45051194539249145,
"acc_norm_stderr": 0.014539646098471627
},
"harness|hellaswag|10": {
"acc": 0.45717984465245964,
"acc_stderr": 0.004971449552787173,
"acc_norm": 0.6079466241784505,
"acc_norm_stderr": 0.0048721072620824726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779204,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779204
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.0398124054371786,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.0398124054371786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.42487046632124353,
"acc_stderr": 0.0356747133521254,
"acc_norm": 0.42487046632124353,
"acc_norm_stderr": 0.0356747133521254
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.02143295620345332,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.02143295620345332
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.0346022832723917,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.0346022832723917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.03248197400511075,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.03248197400511075
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.03011821010694265,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.03011821010694265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4904214559386973,
"acc_stderr": 0.017876682275340845,
"acc_norm": 0.4904214559386973,
"acc_norm_stderr": 0.017876682275340845
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095273,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4533762057877814,
"acc_stderr": 0.02827435985489426,
"acc_norm": 0.4533762057877814,
"acc_norm_stderr": 0.02827435985489426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4506172839506173,
"acc_stderr": 0.027684721415656203,
"acc_norm": 0.4506172839506173,
"acc_norm_stderr": 0.027684721415656203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3324641460234681,
"acc_stderr": 0.012032022332260512,
"acc_norm": 0.3324641460234681,
"acc_norm_stderr": 0.012032022332260512
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.019412539242032165,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.019412539242032165
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5621890547263682,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.5621890547263682,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4853801169590643,
"acc_stderr": 0.038331852752130205,
"acc_norm": 0.4853801169590643,
"acc_norm_stderr": 0.038331852752130205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.44491612521505014,
"mc2_stderr": 0.014935356559440623
},
"harness|winogrande|5": {
"acc": 0.6219415943172849,
"acc_stderr": 0.013628165460523239
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905493
}
}
```
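Because the snapshot above is plain JSON, it is also easy to inspect programmatically. A minimal sketch (assuming the downloaded file nests the per-task scores under a top-level `"results"` key, as leaderboard details repos typically do):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results JSON shown above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseekcoder-33b-v16.1-32k",
    filename="results_2024-01-08T05-49-52.384662.json",
    repo_type="dataset",
)
with open(path) as f:
    scores = json.load(f)["results"]  # assumed top-level key

# Rank the MMLU subtasks by normalized accuracy.
mmlu = {
    task: s["acc_norm"]
    for task, s in scores.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: -kv[1])[:3]:
    print(task, round(acc, 3))
# Expected top entries from the snapshot above: marketing (0.697),
# computer_security (0.680), high_school_computer_science (0.670)
```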
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
heliosprime/twitter_dataset_1713154126 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 5535
num_examples: 15
download_size: 10183
dataset_size: 5535
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713154126"
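A quick way to peek at the records (a minimal sketch; the field names and the 15-example train split come from the feature list in the header above):
```python
from datasets import load_dataset

ds = load_dataset("heliosprime/twitter_dataset_1713154126", split="train")
print(ds)  # 15 examples with id, tweet_content, user_name, ... fields
print(ds[0]["tweet_content"])
```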
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2 | ---
pretty_name: Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T17:51:45.656296](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2/blob/main/results_2024-01-22T17-51-45.656296.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5919635778509046,\n\
\ \"acc_stderr\": 0.03367247342299486,\n \"acc_norm\": 0.596476293969353,\n\
\ \"acc_norm_stderr\": 0.03436978735851277,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6407684674777628,\n\
\ \"mc2_stderr\": 0.015297982301051796\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5716723549488054,\n \"acc_stderr\": 0.014460496367599017,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6471818362875921,\n\
\ \"acc_stderr\": 0.004768701562988875,\n \"acc_norm\": 0.8370842461661023,\n\
\ \"acc_norm_stderr\": 0.003685340687255413\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367405,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367405\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032488,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032488\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934837,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934837\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712997,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712997\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.01787121776779024,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.01787121776779024\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734198,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734198\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n\
\ \"acc_stderr\": 0.01597266852368907,\n \"acc_norm\": 0.35195530726256985,\n\
\ \"acc_norm_stderr\": 0.01597266852368907\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721535,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721535\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073063,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073063\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6407684674777628,\n\
\ \"mc2_stderr\": 0.015297982301051796\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36997725549658833,\n \
\ \"acc_stderr\": 0.013298661207727129\n }\n}\n```"
repo_url: https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|arc:challenge|25_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|gsm8k|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hellaswag|10_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-51-45.656296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T17-51-45.656296.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- '**/details_harness|winogrande|5_2024-01-22T17-51-45.656296.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T17-51-45.656296.parquet'
- config_name: results
data_files:
- split: 2024_01_22T17_51_45.656296
path:
- results_2024-01-22T17-51-45.656296.parquet
- split: latest
path:
- results_2024-01-22T17-51-45.656296.parquet
---
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2",
"harness_winogrande_5",
split="train")
```
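To enumerate every task configuration, or to load the aggregated "results" configuration directly, here is a minimal sketch using the same `datasets` API (the config and split names are taken from the YAML header above):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2"

# List the task configurations plus the aggregated "results" one.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# The "latest" split always points at the most recent evaluation run;
# timestamped splits (e.g. "2024_01_22T17_51_45.656296") pin a specific run.
results = load_dataset(repo, "results", split="latest")
```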
## Latest results
These are the [latest results from run 2024-01-22T17:51:45.656296](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2/blob/main/results_2024-01-22T17-51-45.656296.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5919635778509046,
"acc_stderr": 0.03367247342299486,
"acc_norm": 0.596476293969353,
"acc_norm_stderr": 0.03436978735851277,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.017485542258489646,
"mc2": 0.6407684674777628,
"mc2_stderr": 0.015297982301051796
},
"harness|arc:challenge|25": {
"acc": 0.5716723549488054,
"acc_stderr": 0.014460496367599017,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685247
},
"harness|hellaswag|10": {
"acc": 0.6471818362875921,
"acc_stderr": 0.004768701562988875,
"acc_norm": 0.8370842461661023,
"acc_norm_stderr": 0.003685340687255413
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.037786210790920566,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.037786210790920566
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032488,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032488
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934837,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.01787121776779024,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.01787121776779024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734198,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734198
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.01597266852368907,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.01597266852368907
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.02724561304721535,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.02724561304721535
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336461,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073063,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073063
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.017485542258489646,
"mc2": 0.6407684674777628,
"mc2_stderr": 0.015297982301051796
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.36997725549658833,
"acc_stderr": 0.013298661207727129
}
}
```
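To work with these aggregated numbers programmatically, one option is to download the results file itself; the following is a sketch assuming the standard `huggingface_hub` client (depending on the harness version, the metrics may sit at the top level, as excerpted above, or under a "results" key, hence the fallback):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file linked above; repo_type="dataset" is
# required because this is a dataset repository, not a model.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca-DPO2",
    filename="results_2024-01-22T17-51-45.656296.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)

# Example: average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc: {sum(mmlu) / len(mmlu):.4f}")
```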
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vngrs-ai/vngrs-web-corpus | ---
dataset_info:
features:
- name: text
dtype: string
- name: corpus
dtype: string
- name: original_id
dtype: int64
splits:
- name: train
num_bytes: 141807806497
num_examples: 50336214
download_size: 84893303434
dataset_size: 141807806497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nc-sa-4.0
language:
- tr
---
# Dataset Card for vngrs-web-corpus
vngrs-web-corpus is a mixed dataset made of the cleaned Turkish sections of [OSCAR-2201](https://huggingface.co/datasets/oscar-corpus/OSCAR-2201) and [mC4](https://huggingface.co/datasets/mc4).
This dataset was originally created for training [VBART](https://arxiv.org/abs/2403.01308) and later used for training [TURNA](https://arxiv.org/abs/2401.14373).
The cleaning procedures of this dataset are explained in Appendix A of the [VBART paper](https://arxiv.org/abs/2403.01308).
It consists of 50.3M pages and 25.33B tokens when tokenized with the VBART tokenizer.
## Dataset Details
### Dataset Description
- **Curated by:** [VNGRS-AI](https://vngrs.com/ai/)
- **Language (NLP):** Turkish
- **License:** cc-by-nc-sa-4.0
## Uses
vngrs-web-corpus is mainly intended for pretraining language models and word representations.
## Dataset Structure
- **text** [str]: main text content of the dataset
- **corpus** [str]: the source corpus
- **original_id** [int]: original index of the record in the source corpus (see the loading sketch below)
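Given the ~85 GB download size, streaming is a practical way to inspect records without materializing the full corpus; a minimal sketch with the `datasets` library:
```python
from datasets import load_dataset

# Stream the corpus instead of downloading all ~85 GB up front.
ds = load_dataset("vngrs-ai/vngrs-web-corpus", split="train", streaming=True)

for example in ds.take(3):
    # Each record carries the three fields described above.
    print(example["corpus"], example["original_id"], example["text"][:80])
```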
## Bias, Risks, and Limitations
This dataset holds content crawled from the open web. It was cleaned using a set of rules and heuristics, without accounting for the semantics of the content.
In cases where the content is irrelevant or inappropriate, it should be flagged and removed accordingly.
The dataset is intended for research purposes only and should not be used for any other purposes without prior consent from the relevant authorities.
## Citation
All attributions should be made to the VBART paper.
```
@article{turker2024vbart,
title={VBART: The Turkish LLM},
author={Turker, Meliksah and Ari, Erdi and Han, Aydin},
journal={arXiv preprint arXiv:2403.01308},
year={2024}
}
``` |
Drizer/instruct-qa-20k | ---
license: openrail
---
|
reginaboateng/ebmnlp_pico | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: chunk_tags
sequence: string
- name: pos_tags
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-INT
'2': I-OUT
'3': I-PAR
splits:
- name: train
num_bytes: 27639457
num_examples: 23952
- name: test
num_bytes: 1482781
num_examples: 2065
- name: dev
num_bytes: 7446993
num_examples: 7049
download_size: 4095965
dataset_size: 36569231
---
# Dataset Card for "ebmnlp_pico"
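The YAML header above defines `ner_tags` as a sequence of class labels (O, I-INT, I-OUT, I-PAR); a minimal sketch, assuming the standard `datasets` API, for decoding them back to their string names:
```python
from datasets import load_dataset

# Load the training split and decode the integer NER tags to label names.
ds = load_dataset("reginaboateng/ebmnlp_pico", split="train")
names = ds.features["ner_tags"].feature.names  # ['O', 'I-INT', 'I-OUT', 'I-PAR']

example = ds[0]
for token, tag in zip(example["tokens"][:10], example["ner_tags"][:10]):
    print(token, names[tag])
```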
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Glaud/owls | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 523524.0
num_examples: 6
download_size: 524962
dataset_size: 523524.0
---
# Dataset Card for "owls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_databricks__dolly-v2-7b | ---
pretty_name: Evaluation run of databricks/dolly-v2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [databricks/dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_databricks__dolly-v2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T13:27:34.576106](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-7b/blob/main/results_2023-10-15T13-27-34.576106.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.00044451099905589976,\n \"f1\": 0.059697986577181554,\n\
\ \"f1_stderr\": 0.0013648879248414308,\n \"acc\": 0.3060018322459733,\n\
\ \"acc_stderr\": 0.008342799872753168\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589976,\n\
\ \"f1\": 0.059697986577181554,\n \"f1_stderr\": 0.0013648879248414308\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.002920666198788728\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6006314127861089,\n \"acc_stderr\": 0.013764933546717607\n\
\ }\n}\n```"
repo_url: https://huggingface.co/databricks/dolly-v2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T13_27_34.576106
path:
- '**/details_harness|drop|3_2023-10-15T13-27-34.576106.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T13-27-34.576106.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T13_27_34.576106
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-27-34.576106.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-27-34.576106.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:46:56.588473.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T11:46:56.588473.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T13_27_34.576106
path:
- '**/details_harness|winogrande|5_2023-10-15T13-27-34.576106.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T13-27-34.576106.parquet'
- config_name: results
data_files:
- split: 2023_07_18T11_46_56.588473
path:
- results_2023-07-18T11:46:56.588473.parquet
- split: 2023_10_15T13_27_34.576106
path:
- results_2023-10-15T13-27-34.576106.parquet
- split: latest
path:
- results_2023-10-15T13-27-34.576106.parquet
---
# Dataset Card for Evaluation run of databricks/dolly-v2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/databricks/dolly-v2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [databricks/dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-7b",
"harness_winogrande_5",
split="train")
```
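For example, to load the aggregated metrics of the most recent run, you can target the "results" configuration and its "latest" split (a minimal sketch using the same `datasets` API as above):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the newest run
results = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-7b",
	"results",
	split="latest")
```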
## Latest results
These are the [latest results from run 2023-10-15T13:27:34.576106](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-7b/blob/main/results_2023-10-15T13-27-34.576106.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589976,
"f1": 0.059697986577181554,
"f1_stderr": 0.0013648879248414308,
"acc": 0.3060018322459733,
"acc_stderr": 0.008342799872753168
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589976,
"f1": 0.059697986577181554,
"f1_stderr": 0.0013648879248414308
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.002920666198788728
},
"harness|winogrande|5": {
"acc": 0.6006314127861089,
"acc_stderr": 0.013764933546717607
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zolak/twitter_dataset_78_1713226798 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 124398
num_examples: 297
download_size: 68976
dataset_size: 124398
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zambezivoice/zambezivoice_lozi_text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 185840
num_examples: 2525
download_size: 107478
dataset_size: 185840
---
# Dataset Card for "zambezivoice_lozi_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/merge_new_para_detection_data_v8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 12768876.9
num_examples: 75600
- name: test
num_bytes: 1418764.1
num_examples: 8400
download_size: 6418901
dataset_size: 14187641.0
---
# Dataset Card for "merge_new_para_detection_data_v8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Noidriy/chaika-images | ---
license: cc-by-3.0
---
|
mthxz/ben10_RVCV1 | ---
license: unknown
---
|
saklee/test999 | ---
license: bigscience-bloom-rail-1.0
task_categories:
- zero-shot-classification
language:
- af
tags:
- finance
size_categories:
- 10B<n<100B
---
#11111 |
Amirjalaly/ooast_prompts | ---
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 70129108
num_examples: 26370
download_size: 27087145
dataset_size: 70129108
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OddBunny/fox_femboy | ---
license: cc-by-nc-nd-4.0
---
|
sproos/summeval-fr | ---
dataset_info:
features:
- name: machine_summaries
sequence: string
- name: human_summaries
sequence: string
- name: relevance
sequence: float64
- name: coherence
sequence: float64
- name: fluency
sequence: float64
- name: consistency
sequence: float64
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1276634
num_examples: 100
download_size: 503320
dataset_size: 1276634
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "summeval-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/5b0a064f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1340
dataset_size: 184
---
# Dataset Card for "5b0a064f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mangostin2010/KakaoChatData-alpaca | ---
license: other
---
|
joey234/mmlu-moral_scenarios | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 7379
num_examples: 5
- name: test
num_bytes: 4986899
num_examples: 895
download_size: 339959
dataset_size: 4994278
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-moral_scenarios"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ddgpath/rcc | ---
license: bigscience-openrail-m
---
|
adalib/fate_flow-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 1735944
num_examples: 99
- name: test
num_bytes: 155317
num_examples: 19
download_size: 572287
dataset_size: 1891261
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Guizmus/DreamboothTrainingExample | ---
license: creativeml-openrail-m
---
|
Lollitor/MyPubChem50 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7402208.4
num_examples: 45000
- name: validation
num_bytes: 822467.6
num_examples: 5000
download_size: 2583257
dataset_size: 8224676.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "MyPubChem50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_fri_c2_1000_25 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 156640
num_examples: 712
- name: validation
num_bytes: 63360
num_examples: 288
download_size: 254296
dataset_size: 220000
---
# Dataset Card for "metatree_fri_c2_1000_25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
openerotica/Lamia | ---
license: apache-2.0
---
|
bigscience-data/roots_indic-hi_iitb_english_hindi_corpus | ---
language: hi
license: cc-by-nc-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-hi_iitb_english_hindi_corpus
# IITB English-Hindi Corpus
- Dataset uid: `iitb_english_hindi_corpus`
### Description
The IIT Bombay English-Hindi corpus contains a parallel corpus for English-Hindi as well as a monolingual Hindi corpus, collected from a variety of existing sources and corpora developed at the Center for Indian Language Technology, IIT Bombay over the years. This corpus has been used at the Workshop on Asian Language Translation Shared Task since 2016 for the Hindi-to-English and English-to-Hindi language pairs, and as a pivot for the Hindi-to-Japanese and Japanese-to-Hindi language pairs.
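Since this subset is gated, loading it requires accepting the conditions above and authenticating with the Hugging Face Hub first; a minimal sketch (the "train" split name is an assumption, check the repository files for the exact layout):
```python
from datasets import load_dataset

# Assumes the BigScience gating conditions have been accepted and a Hub token
# is available locally (e.g. via `huggingface-cli login`); split name is assumed
ds = load_dataset("bigscience-data/roots_indic-hi_iitb_english_hindi_corpus",
                  split="train", token=True)
print(ds)
```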
### Homepage
https://www.cfilt.iitb.ac.in/iitb_parallel/
### Licensing
- non-commercial use
- cc-by-nc-nd-4.0: Creative Commons Attribution Non Commercial No Derivatives 4.0 International
### Speaker Locations
- Southern Asia
- India
- Pakistan
### Sizes
- 0.6512 % of total
- 28.5802 % of indic-hi
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
mychen76/color_terms_tinyllama2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5073062.918552837
num_examples: 27109
- name: test
num_bytes: 1268406.0814471627
num_examples: 6778
- name: validation
num_bytes: 253756.07058754095
num_examples: 1356
download_size: 2950539
dataset_size: 6595225.070587541
---
# Dataset Card for "color_terms_tinyllama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ajoshi-6/insincere-subset | ---
license: mit
---
|
open-llm-leaderboard/details_MrNJK__gpt2-xl-sft | ---
pretty_name: Evaluation run of MrNJK/gpt2-xl-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MrNJK/gpt2-xl-sft](https://huggingface.co/MrNJK/gpt2-xl-sft) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MrNJK__gpt2-xl-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T20:10:52.677287](https://huggingface.co/datasets/open-llm-leaderboard/details_MrNJK__gpt2-xl-sft/blob/main/results_2023-09-17T20-10-52.677287.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.000405845113241776,\n \"f1\": 0.053466862416107416,\n\
\ \"f1_stderr\": 0.0012595479932490756,\n \"acc\": 0.28161237645653686,\n\
\ \"acc_stderr\": 0.00817723914058038\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241776,\n\
\ \"f1\": 0.053466862416107416,\n \"f1_stderr\": 0.0012595479932490756\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.0023892815120772075\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5556432517758485,\n \"acc_stderr\": 0.013965196769083553\n\
\ }\n}\n```"
repo_url: https://huggingface.co/MrNJK/gpt2-xl-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T20_10_52.677287
path:
- '**/details_harness|drop|3_2023-09-17T20-10-52.677287.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T20-10-52.677287.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T20_10_52.677287
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-10-52.677287.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-10-52.677287.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:21:02.216696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:21:02.216696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T20_10_52.677287
path:
- '**/details_harness|winogrande|5_2023-09-17T20-10-52.677287.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T20-10-52.677287.parquet'
- config_name: results
data_files:
- split: 2023_08_09T09_21_02.216696
path:
- results_2023-08-09T09:21:02.216696.parquet
- split: 2023_09_17T20_10_52.677287
path:
- results_2023-09-17T20-10-52.677287.parquet
- split: latest
path:
- results_2023-09-17T20-10-52.677287.parquet
---
# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MrNJK/gpt2-xl-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MrNJK/gpt2-xl-sft](https://huggingface.co/MrNJK/gpt2-xl-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MrNJK__gpt2-xl-sft",
"harness_winogrande_5",
split="train")
```
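Each configuration listed in the metadata above can be loaded the same way; to enumerate them programmatically, one option (a sketch, assuming a recent version of the `datasets` library) is:
```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" configuration
configs = get_dataset_config_names("open-llm-leaderboard/details_MrNJK__gpt2-xl-sft")
print(len(configs), configs[:5])
```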
## Latest results
These are the [latest results from run 2023-09-17T20:10:52.677287](https://huggingface.co/datasets/open-llm-leaderboard/details_MrNJK__gpt2-xl-sft/blob/main/results_2023-09-17T20-10-52.677287.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241776,
"f1": 0.053466862416107416,
"f1_stderr": 0.0012595479932490756,
"acc": 0.28161237645653686,
"acc_stderr": 0.00817723914058038
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241776,
"f1": 0.053466862416107416,
"f1_stderr": 0.0012595479932490756
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772075
},
"harness|winogrande|5": {
"acc": 0.5556432517758485,
"acc_stderr": 0.013965196769083553
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
raiaman/up | ---
license: unknown
---
|
fahmiaziz/fingpt | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4446149
num_examples: 2500
download_size: 2495871
dataset_size: 4446149
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ssaigne/HINATA_HAJIME | ---
license: openrail
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rusano/Teli5_1K_tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: decoder_attention_mask
sequence: int64
splits:
- name: train
num_bytes: 2013563824
num_examples: 218107
- name: val
num_bytes: 503393264
num_examples: 54527
download_size: 651358784
dataset_size: 2516957088
---
# Dataset Card for "Teli5_1K_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sam1120/safety-userstudy-terrain | ---
dataset_info:
features:
- name: name
dtype: string
- name: pixel_values
dtype: image
- name: labels
dtype: image
splits:
- name: train
num_bytes: 42072002.0
num_examples: 15
download_size: 12479203
dataset_size: 42072002.0
---
# Dataset Card for "safety-userstudy-terrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
3ee/regularization-tiger | ---
license: mit
tags:
- stable-diffusion
- regularization-images
- text-to-image
- image-to-image
- dreambooth
- class-instance
- preservation-loss-training
---
# Tiger Regularization Images
A collection of regularization & class instance image datasets of tigers for Stable Diffusion 1.5, for use in DreamBooth prior preservation loss training.
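One way to pull the images into a local folder (a sketch using `huggingface_hub`; how the folder is wired into training depends on your DreamBooth trainer):
```python
from huggingface_hub import snapshot_download

# Download the whole dataset repository into a local folder
local_dir = snapshot_download(repo_id="3ee/regularization-tiger", repo_type="dataset")
print(local_dir)
```
The downloaded folder can then be used as the class-image directory during prior preservation loss training. |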
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_CyberTron
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_CyberTron](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T15:25:26.631470](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron/blob/main/results_2024-04-15T15-25-26.631470.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6207341899016878,\n\
\ \"acc_stderr\": 0.03268647985396864,\n \"acc_norm\": 0.6244114490529663,\n\
\ \"acc_norm_stderr\": 0.03334075921277243,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.6122385287224691,\n\
\ \"mc2_stderr\": 0.01514349395606483\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6326428998207528,\n\
\ \"acc_stderr\": 0.004810996652324729,\n \"acc_norm\": 0.8222465644293966,\n\
\ \"acc_norm_stderr\": 0.003815237269961105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886804,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437385,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437385\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.01648913496243895,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.01648913496243895\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580214,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580214\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088833,\n \
\ \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595954,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595954\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.6122385287224691,\n\
\ \"mc2_stderr\": 0.01514349395606483\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838227\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47536012130401817,\n \
\ \"acc_stderr\": 0.013755751352764915\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|arc:challenge|25_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|gsm8k|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hellaswag|10_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-25-26.631470.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T15-25-26.631470.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- '**/details_harness|winogrande|5_2024-04-15T15-25-26.631470.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T15-25-26.631470.parquet'
- config_name: results
data_files:
- split: 2024_04_15T15_25_26.631470
path:
- results_2024-04-15T15-25-26.631470.parquet
- split: latest
path:
- results_2024-04-15T15-25-26.631470.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_CyberTron
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_CyberTron](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron",
"harness_winogrande_5",
split="train")
```
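The aggregated numbers shown below also live in the `results` configuration, whose `latest` split aliases the most recent run; a short sketch using the same API:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# aliases the most recent timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron",
    "results",
    split="latest",
)
print(results[0])
```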
## Latest results
These are the [latest results from run 2024-04-15T15:25:26.631470](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron/blob/main/results_2024-04-15T15-25-26.631470.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6207341899016878,
"acc_stderr": 0.03268647985396864,
"acc_norm": 0.6244114490529663,
"acc_norm_stderr": 0.03334075921277243,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.6122385287224691,
"mc2_stderr": 0.01514349395606483
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522084,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6326428998207528,
"acc_stderr": 0.004810996652324729,
"acc_norm": 0.8222465644293966,
"acc_norm_stderr": 0.003815237269961105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886804,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266854,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266854
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437385,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437385
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.01648913496243895,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.01648913496243895
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580214,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.019576953122088833,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.019576953122088833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595954,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595954
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.6122385287224691,
"mc2_stderr": 0.01514349395606483
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838227
},
"harness|gsm8k|5": {
"acc": 0.47536012130401817,
"acc_stderr": 0.013755751352764915
}
}
```
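To read the same run file programmatically without pulling the whole dataset, the JSON linked above can be fetched directly (a sketch with `huggingface_hub`; the exact layout of the JSON is not asserted here, so the example only lists its top-level sections):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch just the run file linked above instead of the full dataset.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron",
    filename="results_2024-04-15T15-25-26.631470.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)
print(sorted(run))  # top-level sections of the run file
```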
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_acomp_focusing_like | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 40597
num_examples: 196
- name: test
num_bytes: 13668
num_examples: 83
- name: train
num_bytes: 49631
num_examples: 266
download_size: 76381
dataset_size: 103896
---
# Dataset Card for "MULTI_VALUE_stsb_acomp_focusing_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juliaturc/rick-and-morty-s06e01-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 78803729.742
num_examples: 1341
download_size: 78105717
dataset_size: 78803729.742
---
# Dataset Card for "rick-and-morty-s06e01-blip-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_ca_vilaquad | ---
language: ca
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ca_vilaquad
# UIT-ViQuAD – A Vietnamese Dataset for Evaluating Machine Reading Comprehension.
- Dataset uid: `vilaquad`
### Description
Vietnamese Question Answering Dataset (UIT-ViQuAD), a new dataset for evaluating MRC models on a low-resource language such as Vietnamese. This dataset comprises over 23,000 human-generated question-answer pairs based on 5,109 passages from 174 Vietnamese Wikipedia articles.
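A minimal loading sketch (assumptions: the gating terms above have been accepted on the Hub, you are authenticated, and the subset exposes a `train` split):
```python
from datasets import load_dataset

# gated dataset: accept the BigScience Ethical Charter on the Hub first,
# then authenticate, e.g. via `huggingface-cli login`
ds = load_dataset("bigscience-data/roots_ca_vilaquad", split="train")
print(ds[0])
```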
### Homepage
https://sites.google.com/uit.edu.vn/uit-nlp/datasets-projects
### Licensing
- open license
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
- Creative Commons Attribution 4.0 International License
### Speaker Locations
- South-eastern Asia
- Vietnam
### Sizes
- 0.0001 % of total
- 0.0065 % of ca
### BigScience processing steps
#### Filters applied to: ca
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
Francesco/x-ray-rheumatology | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': x-ray-rheumatology
'1': artefact
'2': distal phalanges
'3': fifth metacarpal bone
'4': first metacarpal bone
'5': fourth metacarpal bone
'6': intermediate phalanges
'7': proximal phalanges
'8': radius
'9': second metacarpal bone
'10': soft tissue calcination
'11': third metacarpal bone
'12': ulna
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: x-ray-rheumatology
tags:
- rf100
---
# Dataset Card for x-ray-rheumatology
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/x-ray-rheumatology
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
x-ray-rheumatology
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`) the image file is automatically decoded. Decoding a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]` (see the sketch after this list).
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
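A minimal access sketch (the `train` split name and direct `datasets` loading are assumptions here, since the original COCO archive also ships as `dataset.tar.gz`):
```python
from datasets import load_dataset

ds = load_dataset("Francesco/x-ray-rheumatology", split="train")

# query the sample index first, then the "image" column,
# so that only this one image file is decoded
sample = ds[0]
print(sample["image_id"], sample["width"], sample["height"])
print(sample["objects"]["bbox"])  # boxes in COCO [x, y, width, height] format
```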
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/x-ray-rheumatology
### Citation Information
```
@misc{ x-ray-rheumatology,
title = { x ray rheumatology Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/x-ray-rheumatology } },
url = { https://universe.roboflow.com/object-detection/x-ray-rheumatology },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}"
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
Vertex-Test/FireSmokeDataset | ---
license: apache-2.0
---
|
michaelginn/childes_phones | ---
dataset_info:
features:
- name: line
dtype: string
- name: file
dtype: string
- name: ipa
dtype: string
- name: ipa_syll
dtype: string
splits:
- name: train
num_bytes: 2792238
num_examples: 28466
download_size: 1400857
dataset_size: 2792238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "childes_phones"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dotan1111/MSA-nuc-3-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics, and a plethora of methods have been devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar to, and often better than, that of commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences, used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A" and "G"; (2) substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@article{Dotan_multiple_2023,
author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
month = aug,
title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
year = {2023}
}
``` |
Falah/Futuristic_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 28829161
num_examples: 100000
download_size: 2389938
dataset_size: 28829161
---
# Dataset Card for "Futuristic_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-boolq-default-cb11e4-46279145185 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- boolq
eval_info:
task: natural_language_inference
model: andi611/distilbert-base-uncased-qa-boolq
metrics: []
dataset_name: boolq
dataset_config: default
dataset_split: train
col_mapping:
text1: passage
text2: question
target: answer
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: andi611/distilbert-base-uncased-qa-boolq
* Dataset: boolq
* Config: default
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mabuyun](https://huggingface.co/mabuyun) for evaluating this model. |
open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B | ---
pretty_name: Evaluation run of Weyaxi/Dolphin-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Dolphin-Nebula-7B](https://huggingface.co/Weyaxi/Dolphin-Nebula-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T14:25:38.586013](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B/blob/main/results_2023-12-02T14-25-38.586013.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3305534495830174,\n\
\ \"acc_stderr\": 0.012957496367085026\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.3305534495830174,\n \"acc_stderr\": 0.012957496367085026\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Dolphin-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T14_25_38.586013
path:
- '**/details_harness|gsm8k|5_2023-12-02T14-25-38.586013.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T14-25-38.586013.parquet'
- config_name: results
data_files:
- split: 2023_12_02T14_25_38.586013
path:
- results_2023-12-02T14-25-38.586013.parquet
- split: latest
path:
- results_2023-12-02T14-25-38.586013.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Dolphin-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Dolphin-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Dolphin-Nebula-7B](https://huggingface.co/Weyaxi/Dolphin-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
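A minimal sketch for loading those aggregated results (the config name and the "latest" split are taken from the configs listed above):
```python
from datasets import load_dataset

# aggregated results; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B",
    "results",
    split="latest",
)
```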
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T14:25:38.586013](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Dolphin-Nebula-7B/blob/main/results_2023-12-02T14-25-38.586013.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085026
},
"harness|gsm8k|5": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085026
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Multimodal-Fatima/DTD_parition1_test_facebook_opt_6.7b_Attributes_Caption_ns_1880_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 92259840.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 93271918.0
num_examples: 1880
download_size: 181110966
dataset_size: 185531758.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_6.7b_Attributes_Caption_ns_1880_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
claudiotarbe/voices | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_164 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1129044068.0
num_examples: 221729
download_size: 1152434732
dataset_size: 1129044068.0
---
# Dataset Card for "chunk_164"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713015504 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 29643
num_examples: 78
download_size: 16933
dataset_size: 29643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/saileach_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saileach/サイラッハ/琴柳 (Arknights)
This is the dataset of saileach/サイラッハ/琴柳 (Arknights), containing 330 images and their tags.
The core tags of this character are `long_hair, blonde_hair, horns, blue_eyes, pointy_ears, breasts, very_long_hair, hairband, large_breasts, blue_hairband, braid, dragon_horns, hair_between_eyes, twin_braids`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 330 | 722.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saileach_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 330 | 585.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saileach_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 843 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/saileach_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saileach_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_necktie, elbow_gloves, looking_at_viewer, solo, upper_body, white_shirt, smile, fingerless_gloves, hand_up, simple_background, white_background, arm_strap, blush, grey_background, hand_on_own_chest, medium_breasts |
| 1 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, upper_body, blue_necktie, smile, bare_shoulders, simple_background, white_background, white_shirt, arm_strap, closed_mouth |
| 2 | 29 |  |  |  |  |  | 1girl, solo, bare_shoulders, black_skirt, white_shirt, blue_necktie, elbow_gloves, miniskirt, looking_at_viewer, standing, zettai_ryouiki, black_gloves, cowboy_shot, white_thighhighs, arm_strap, fingerless_gloves, thighs, smile, standard_bearer, holding_flag, holding_weapon, sword, pouch |
| 3 | 39 |  |  |  |  |  | 1girl, solo, bare_shoulders, white_dress, official_alternate_costume, off-shoulder_dress, looking_at_viewer, flower, white_gloves, smile, cleavage, choker, strapless, holding_umbrella |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, solo, thighs, casual_one-piece_swimsuit, official_alternate_costume, thigh_strap, black_one-piece_swimsuit, hair_flower, smile, blush, navel, sitting, nail_polish, red_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | blue_necktie | elbow_gloves | looking_at_viewer | solo | upper_body | white_shirt | smile | fingerless_gloves | hand_up | simple_background | white_background | arm_strap | blush | grey_background | hand_on_own_chest | medium_breasts | closed_mouth | black_skirt | miniskirt | standing | zettai_ryouiki | cowboy_shot | white_thighhighs | thighs | standard_bearer | holding_flag | holding_weapon | sword | pouch | white_dress | official_alternate_costume | off-shoulder_dress | flower | white_gloves | cleavage | choker | strapless | holding_umbrella | casual_one-piece_swimsuit | thigh_strap | black_one-piece_swimsuit | hair_flower | navel | sitting | nail_polish | red_flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:---------------|:---------------|:--------------------|:-------|:-------------|:--------------|:--------|:--------------------|:----------|:--------------------|:-------------------|:------------|:--------|:------------------|:--------------------|:-----------------|:---------------|:--------------|:------------|:-----------|:-----------------|:--------------|:-------------------|:---------|:------------------|:---------------|:-----------------|:--------|:--------|:--------------|:-----------------------------|:---------------------|:---------|:---------------|:-----------|:---------|:------------|:-------------------|:----------------------------|:--------------|:---------------------------|:--------------|:--------|:----------|:--------------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | X | | X | X | X | X | X | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 3 | 39 |  |  |  |  |  | X | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | | | X | X | | | X | | | | | | X | | | | | | | | | | | X | | | | | | | X | | | | X | | | | X | X | X | X | X | X | X | X |
|
CyberHarem/shibuya_kanon_lovelivesuperstar | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shibuya_kanon/澁谷かのん/시부야카논 (Love Live! Superstar!!)
This is the dataset of shibuya_kanon/澁谷かのん/시부야카논 (Love Live! Superstar!!), containing 500 images and their tags.
The core tags of this character are `bangs, orange_hair, purple_eyes, long_hair, ribbon, neck_ribbon, red_ribbon, shiny_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 849.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shibuya_kanon_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 381.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shibuya_kanon_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1294 | 889.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shibuya_kanon_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 699.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shibuya_kanon_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1294 | 1.41 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shibuya_kanon_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shibuya_kanon_lovelivesuperstar',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, :d, blue_jacket, collared_shirt, grey_dress, looking_at_viewer, open_jacket, open_mouth, pinafore_dress, solo, white_shirt, yuigaoka_school_uniform, long_sleeves, shiny, upper_body, blush, medium_hair |
| 1 | 9 |  |  |  |  |  | 1girl, blue_jacket, collared_shirt, open_jacket, smile, solo, white_background, white_shirt, yuigaoka_school_uniform, blush, closed_mouth, grey_dress, looking_at_viewer, shiny, simple_background, upper_body, long_sleeves, hair_between_eyes, pinafore_dress |
| 2 | 42 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, upper_body, blush, earrings, birthday, long_sleeves, dress, hat, open_mouth, star_(symbol), medium_hair |
| 3 | 6 |  |  |  |  |  | 1girl, blush, smile, solo, white_gloves, looking_at_viewer, open_mouth, white_dress, elbow_gloves, short_sleeves, blue_hairband, breasts, upper_body |
| 4 | 7 |  |  |  |  |  | 1girl, :d, open_mouth, short_sleeves, solo, blush, hat, blue_sky, looking_at_viewer, belt_buckle, blue_belt, cloud, white_belt, white_headwear, white_shirt, medium_hair, white_skirt |
| 5 | 8 |  |  |  |  |  | 1girl, collarbone, solo, blush, hair_scrunchie, looking_at_viewer, medium_hair, open_mouth, shorts, sweat, breasts, simple_background, white_shirt, hair_between_eyes, off_shoulder, short_sleeves, blue_scrunchie, holding, pants, shiny, shoes, short_hair, swept_bangs, towel, water_bottle, white_background |
| 6 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, maid_headdress, solo, cowboy_shot, white_apron, enmaided, standing, blush, hair_between_eyes, orange_skirt, smile, collared_shirt, dress_shirt, frilled_skirt, miniskirt, orange_bowtie, shiny, open_mouth, white_background, wing_collar, frilled_apron, holding_plate, puffy_short_sleeves, simple_background, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | :d | blue_jacket | collared_shirt | grey_dress | looking_at_viewer | open_jacket | open_mouth | pinafore_dress | solo | white_shirt | yuigaoka_school_uniform | long_sleeves | shiny | upper_body | blush | medium_hair | smile | white_background | closed_mouth | simple_background | hair_between_eyes | earrings | birthday | dress | hat | star_(symbol) | white_gloves | white_dress | elbow_gloves | short_sleeves | blue_hairband | breasts | blue_sky | belt_buckle | blue_belt | cloud | white_belt | white_headwear | white_skirt | collarbone | hair_scrunchie | shorts | sweat | off_shoulder | blue_scrunchie | holding | pants | shoes | short_hair | swept_bangs | towel | water_bottle | maid_headdress | cowboy_shot | white_apron | enmaided | standing | orange_skirt | dress_shirt | frilled_skirt | miniskirt | orange_bowtie | wing_collar | frilled_apron | holding_plate | puffy_short_sleeves | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----|:--------------|:-----------------|:-------------|:--------------------|:--------------|:-------------|:-----------------|:-------|:--------------|:--------------------------|:---------------|:--------|:-------------|:--------|:--------------|:--------|:-------------------|:---------------|:--------------------|:--------------------|:-----------|:-----------|:--------|:------|:----------------|:---------------|:--------------|:---------------|:----------------|:----------------|:----------|:-----------|:--------------|:------------|:--------|:-------------|:-----------------|:--------------|:-------------|:-----------------|:---------|:--------|:---------------|:-----------------|:----------|:--------|:--------|:-------------|:--------------|:--------|:---------------|:-----------------|:--------------|:--------------|:-----------|:-----------|:---------------|:--------------|:----------------|:------------|:----------------|:--------------|:----------------|:----------------|:----------------------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 42 |  |  |  |  |  | X | | | | | X | | X | | X | | | X | | X | X | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | X | | X | | X | | | | | X | X | | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | X | | X | | X | X | | | | | X | X | | | | | | | | | X | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | X | | X | | X | X | | | X | | X | X | | X | | X | X | | | | | | | | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | | X | | X | | X | | X | | | | X | | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/minase_iori_theidolmster | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of minase_iori/水瀬伊織/미나세이오리 (THE iDOLM@STER)
This is the dataset of minase_iori/水瀬伊織/미나세이오리 (THE iDOLM@STER), containing 500 images and their tags.
The core tags of this character are `long_hair, brown_hair, hairband, brown_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 496.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minase_iori_theidolmster/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 338.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minase_iori_theidolmster/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1126 | 664.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minase_iori_theidolmster/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 457.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minase_iori_theidolmster/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1126 | 860.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minase_iori_theidolmster/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/minase_iori_theidolmster',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, dress, solo, blush, black_thighhighs, bow, zettai_ryouiki |
| 1 | 7 |  |  |  |  |  | 1girl, dress, rabbit, solo, stuffed_animal, stuffed_bunny, blush, open_mouth, sitting, smile |
| 2 | 7 |  |  |  |  |  | 1girl, solo, stuffed_animal, stuffed_bunny, dress, smile, one_eye_closed |
| 3 | 5 |  |  |  |  |  | 1girl, black_thighhighs, skirt, solo, zettai_ryouiki, plaid, smile, bespectacled, necktie |
| 4 | 6 |  |  |  |  |  | 1girl, bracelet, solo, dress, bare_shoulders, blush, looking_at_viewer, smile, open_mouth |
| 5 | 16 |  |  |  |  |  | 1girl, necklace, solo, beret, dress, thighhighs, belt, smile, earrings, one_eye_closed, wrist_cuffs, bare_shoulders, open_mouth |
| 6 | 6 |  |  |  |  |  | 1girl, solo, looking_at_viewer, sailor_bikini, white_bikini, blush, navel, sitting, bow, breasts, open_mouth, simple_background, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | solo | blush | black_thighhighs | bow | zettai_ryouiki | rabbit | stuffed_animal | stuffed_bunny | open_mouth | sitting | smile | one_eye_closed | skirt | plaid | bespectacled | necktie | bracelet | bare_shoulders | looking_at_viewer | necklace | beret | thighhighs | belt | earrings | wrist_cuffs | sailor_bikini | white_bikini | navel | breasts | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:-------------------|:------|:-----------------|:---------|:-----------------|:----------------|:-------------|:----------|:--------|:-----------------|:--------|:--------|:---------------|:----------|:-----------|:-----------------|:--------------------|:-----------|:--------|:-------------|:-------|:-----------|:--------------|:----------------|:---------------|:--------|:----------|:--------------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | | | | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | X | | X | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | X | | X | | | | | | X | X | X | | | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | X | X | | | | | | | | X | | X | X | | | | | | X | | X | X | X | X | X | X | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | X | | X | | | | | X | X | X | | | | | | | | X | | | | | | | X | X | X | X | X | X |
|
Codec-SUPERB/gunshot_triangulation_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 12677868.0
num_examples: 88
- name: academicodec_hifi_16k_320d
num_bytes: 4229944.0
num_examples: 88
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 4229944.0
num_examples: 88
- name: academicodec_hifi_24k_320d
num_bytes: 6313784.0
num_examples: 88
- name: audiodec_24k_320d
num_bytes: 6341956.0
num_examples: 88
- name: dac_16k
num_bytes: 4229944.0
num_examples: 88
- name: dac_24k
num_bytes: 6341944.0
num_examples: 88
- name: dac_44k
num_bytes: 11648344.0
num_examples: 88
- name: encodec_24k_12bps
num_bytes: 6341944.0
num_examples: 88
- name: encodec_24k_1_5bps
num_bytes: 6341944.0
num_examples: 88
- name: encodec_24k_24bps
num_bytes: 6341944.0
num_examples: 88
- name: encodec_24k_3bps
num_bytes: 6341944.0
num_examples: 88
- name: encodec_24k_6bps
num_bytes: 6341944.0
num_examples: 88
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 4229944.0
num_examples: 88
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 4229944.0
num_examples: 88
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 4229944.0
num_examples: 88
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 4229944.0
num_examples: 88
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 4229944.0
num_examples: 88
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 4229944.0
num_examples: 88
- name: speech_tokenizer_16k
num_bytes: 4229944.0
num_examples: 88
download_size: 110782805
dataset_size: 117333056.0
---
# Dataset Card for "gunshot_triangulation_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/10_Categories_8085_Groups_of_Urban_Refined_Management_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
10 Categories – 8,085 Groups of Urban Refined Management Data. The collection scenes include street, snack street, shop entrance, corridor, community entrance, construction site, etc. The data diversity includes multiple scenes, different time periods (day, night), and different photographic angles. The urban refined management categories in the images were annotated with rectangular bounding boxes. This data can be used for tasks such as urban refined management.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1092?source=Huggingface
# Specifications
## Data size
10 categories, covering 18 subclasses; each group of data contains 2 images from different angles and 1 video
## Collecting environment
including street, snack street, shop entrance, corridor, community entrance, construction site, etc.
## Data diversity
multiple scenes, different time periods, different photographic angles
## Device
cellphone
## Collecting angle
looking down angle
## Collecting time
day, night
## Data format
the image data format is .jpg; the video data formats are .mp4 and .mov; the annotation file format is .json
## Annotation content
the urban refined management categories in the images were annotated with rectangular bounding boxes
## Accuracy rate
the error bound of each vertex of a quadrilateral bounding box is within 3 pixels, which counts as a qualified annotation; the accuracy of bounding boxes is not less than 95%; the accuracy of label annotation is not less than 95%
# Licensing Information
Commercial License
|
dembastu/methods2test_raw_grouped_tok | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1110668740
num_examples: 631120
download_size: 361945934
dataset_size: 1110668740
---
# Dataset Card for "methods2test_raw_grouped_tok"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713215463 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 189200
num_examples: 443
download_size: 66132
dataset_size: 189200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_subord_conjunction_doubling | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5255
num_examples: 18
- name: test
num_bytes: 27092
num_examples: 108
- name: train
num_bytes: 32353
num_examples: 128
download_size: 50772
dataset_size: 64700
---
# Dataset Card for "MULTI_VALUE_qqp_subord_conjunction_doubling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GPTGone/hc3_v2 | ---
license: mit
---
# About Dataset
This is an extension of the HC3 dataset. We added around 25k new ChatGPT responses, i.e., roughly 25k new rows of data compared to the original HC3.
The main dataset is in HC3_With_Scraped_Data.csv
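A minimal loading sketch (assuming the CSV sits at the root of this dataset repo under the filename above):
```python
import pandas as pd
from huggingface_hub import hf_hub_download

# download the main CSV from the dataset repo and read it with pandas
csv_path = hf_hub_download(
    repo_id="GPTGone/hc3_v2",
    repo_type="dataset",
    filename="HC3_With_Scraped_Data.csv",
)
df = pd.read_csv(csv_path)
print(df.shape)
```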
The other files consist of other features such as GLTR scores, perplexity scores, etc. |
LIDIA-HESSEN/vencortex-BusinessNewsDataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: image
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: type
dtype: string
- name: context_id
dtype: string
- name: source
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 290733891
num_examples: 469361
download_size: 123671926
dataset_size: 290733891
---
# Dataset Card for "BusinessNewsDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SuperMari/supermari | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 18104662.0
num_examples: 17
download_size: 17061196
dataset_size: 18104662.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456330 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-350m
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-350m
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
ashwathjadhav23/Spanish_MLM_1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3504255
num_examples: 25000
download_size: 1949854
dataset_size: 3504255
---
# Dataset Card for "Spanish_MLM_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JasiekKaczmarczyk/pianofor-ai-sustain-masked | ---
dataset_info:
features:
- name: midi_filename
dtype: string
- name: source
dtype: string
- name: pitch
sequence: int16
length: 128
- name: start
sequence: float32
length: 128
- name: dstart
sequence: float32
length: 128
- name: duration
sequence: float32
length: 128
- name: velocity
sequence: int16
length: 128
- name: masking_spaces
struct:
- name: <Random Mask>
sequence: bool
length: 128
- name: <LH Mask>
sequence: bool
length: 128
- name: <RH Mask>
sequence: bool
length: 128
- name: <Harmonic Root Mask>
sequence: bool
length: 128
- name: <Harmonic Outliers Mask>
sequence: bool
length: 128
splits:
- name: train
num_bytes: 454163007
num_examples: 189001
- name: validation
num_bytes: 43536465
num_examples: 18262
- name: test
num_bytes: 52054314
num_examples: 21576
download_size: 319101693
dataset_size: 549753786
---
# Dataset Card for "pianofor-ai-sustain-masked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hmao/multiapi_eval_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: fncall
sequence: string
- name: dataset
dtype: string
- name: generated_question
dtype: string
splits:
- name: train
num_bytes: 37075
num_examples: 95
download_size: 17812
dataset_size: 37075
---
# Dataset Card for "multiapi_eval_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual_babylm_keys_to_pipps_2913 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 582987526
num_examples: 11635530
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 422376004
dataset_size: 639107756
---
# Dataset Card for "counterfactual_babylm_keys_to_pipps_2913"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yahir/edits | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713061495 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11206
num_examples: 25
download_size: 8870
dataset_size: 11206
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713061495"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_0x7194633__nanoFialka-v1 | ---
pretty_name: Evaluation run of 0x7194633/nanoFialka-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0x7194633/nanoFialka-v1](https://huggingface.co/0x7194633/nanoFialka-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0x7194633__nanoFialka-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T16:01:50.932005](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__nanoFialka-v1/blob/main/results_2024-01-10T16-01-50.932005.json) (note\
\ that there might be results for other tasks in this repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24949692414008695,\n\
\ \"acc_stderr\": 0.030489858984333953,\n \"acc_norm\": 0.25034227833551215,\n\
\ \"acc_norm_stderr\": 0.031302865499426825,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4525733674718429,\n\
\ \"mc2_stderr\": 0.015709658694891028\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1757679180887372,\n \"acc_stderr\": 0.011122850863120485,\n\
\ \"acc_norm\": 0.22013651877133106,\n \"acc_norm_stderr\": 0.01210812488346098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2703644692292372,\n\
\ \"acc_stderr\": 0.004432403734882275,\n \"acc_norm\": 0.28121888070105555,\n\
\ \"acc_norm_stderr\": 0.00448675220043036\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062949,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062949\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n\
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747548,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.31290322580645163,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.022939925418530623,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530623\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978103,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978103\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119996,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119996\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790208,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790208\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278441,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278441\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927234,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927234\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378974,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378974\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.02599111767281329,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.02599111767281329\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594688,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594688\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4525733674718429,\n\
\ \"mc2_stderr\": 0.015709658694891028\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076892\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/0x7194633/nanoFialka-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-01-50.932005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-01-50.932005.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- '**/details_harness|winogrande|5_2024-01-10T16-01-50.932005.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T16-01-50.932005.parquet'
- config_name: results
data_files:
- split: 2024_01_10T16_01_50.932005
path:
- results_2024-01-10T16-01-50.932005.parquet
- split: latest
path:
- results_2024-01-10T16-01-50.932005.parquet
---
# Dataset Card for Evaluation run of 0x7194633/nanoFialka-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0x7194633/nanoFialka-v1](https://huggingface.co/0x7194633/nanoFialka-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0x7194633__nanoFialka-v1",
"harness_winogrande_5",
split="train")
```
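Since each evaluated task maps to its own configuration name, the full set can be enumerated programmatically; a sketch using the standard `datasets` helper (assuming Hub access):
```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_0x7194633__nanoFialka-v1"
)
print(len(configs))  # the per-task configs plus the aggregated "results" config
print(configs[:3])   # e.g. harness_arc_challenge_25, harness_gsm8k_5, ...
```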
## Latest results
These are the [latest results from run 2024-01-10T16:01:50.932005](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__nanoFialka-v1/blob/main/results_2024-01-10T16-01-50.932005.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24949692414008695,
"acc_stderr": 0.030489858984333953,
"acc_norm": 0.25034227833551215,
"acc_norm_stderr": 0.031302865499426825,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4525733674718429,
"mc2_stderr": 0.015709658694891028
},
"harness|arc:challenge|25": {
"acc": 0.1757679180887372,
"acc_stderr": 0.011122850863120485,
"acc_norm": 0.22013651877133106,
"acc_norm_stderr": 0.01210812488346098
},
"harness|hellaswag|10": {
"acc": 0.2703644692292372,
"acc_stderr": 0.004432403734882275,
"acc_norm": 0.28121888070105555,
"acc_norm_stderr": 0.00448675220043036
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062949,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062949
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.022939925418530623,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.022939925418530623
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978103,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978103
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119996,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119996
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790208,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790208
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278441,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278441
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927234,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927234
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378974,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.02599111767281329,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.02599111767281329
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4525733674718429,
"mc2_stderr": 0.015709658694891028
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076892
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eckendoerffer/justice_fr | ---
license: cc-by-sa-4.0
language:
- fr
pretty_name: Law & decisions from the French justice system
tags:
- justice
- law
- legal
size_categories:
- 100K<n<1M
---
# Dataset Card for French Legal Dataset
## Dataset Description
The dataset contains a comprehensive collection of French legal books, codes, and appellate court decisions. It encompasses the following:
- 150,938 rows -> 140,000 articles of laws, decrees, and orders from the 78 French books and codes, covering all legal domains. The total number of pages is approximately 35,000.
- 191,741 rows -> 53,000 appellate court decisions spanning from 2013 to the present day. The dataset includes a wide range of cases and legal opinions. The total number of pages is approximately 150,000.
- 1,621 rows -> 1,621 definitions and abbreviations commonly used in legal texts, aiding in understanding and interpreting legal terminology.
Total: 344,300 rows
Line breaks are included in the legal texts. They are necessary to preserve the indentation and paragraph structure that other laws and court decisions refer to.
### Languages
The dataset is in French only.
## Dataset Structure
```
{
"instruction":"Code du travail > Partie l\u00e9gislative > Livre III : Les institutions repr\u00e9sentatives du personnel > Titre Ier : Comit\u00e9 social et \u00e9conomique > Chapitre IV : Composition, \u00e9lections et mandat > Section 1 : Composition > Article L2314-2 (Loi)",
"input":"Je souhaite l'Article L2314-2 du Code du travail",
"output":"Voici l'Article L2314-2 du Code du travail: Sous r\u00e9serve des dispositions applicables dans les entreprises de moins de trois cents salari\u00e9s, pr\u00e9vues \u00e0 l'article L. 2143-22, chaque organisation syndicale repr\u00e9sentative dans l'entreprise ou l'\u00e9tablissement peut d\u00e9signer un repr\u00e9sentant syndical au comit\u00e9. Il assiste aux s\u00e9ances avec voix consultative. Il est choisi parmi les membres du personnel de l'entreprise et doit remplir les conditions d'\u00e9ligibilit\u00e9 au comit\u00e9 social et \u00e9conomique fix\u00e9es \u00e0 l'article L. 2314-19."
},
```
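A minimal sketch of loading the dataset and inspecting one row (a single `train` split is an assumption here; adjust to the splits actually published):
```python
from datasets import load_dataset

# Load the dataset; a single "train" split is assumed here.
ds = load_dataset("eckendoerffer/justice_fr", split="train")

row = ds[0]
print(row["instruction"])   # hierarchy of the code, or court/decision metadata
print(row["input"])         # question variant for codes, empty for court decisions
print(row["output"][:200])  # beginning of the law or decision text
```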
### Data Fields
- `instruction`:
- French books and codes -> hierarchy from law text:
"Code pénal > Partie législative > Livre II : Des crimes et délits contre les personnes > Titre II : Des atteintes à la personne humaine > Chapitre Ier : Des atteintes à la vie de la personne > Section 2 : Des atteintes involontaires à la vie > Article 221-6"
- Court decisions -> location, chamber, decision number, decision date, part:
"Cour d'appel de Paris I5, Cour de cassation Chambre commerciale financière et économique, décision 18-13.763 du 14/04/2021, partie 1"
- `input`:
- French books and codes -> questions with multiple variations, such as: "What does Article XX of Code XX say?"
- Court decisions -> empty
- `output`:
- French books and codes -> laws text
- Court decisions -> decisions text
The text has been truncated or split to approximately 820 words per row, averaging 1,500 tokens (French text, Falcon tokenizer). The goal is to stay under 2,048 tokens, with a safety margin.
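As a rough check of that budget, here is a hedged sketch of counting tokens with a Falcon tokenizer (the exact checkpoint is not stated in the card; `tiiuae/falcon-7b` is an assumption):
```python
from transformers import AutoTokenizer

# The card only says "Falcon tokenizer"; the exact checkpoint is an
# assumption, tiiuae/falcon-7b is used here for illustration.
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")

def n_tokens(text: str) -> int:
    # Count tokens the way the 2,048-token budget would see them.
    return len(tokenizer(text).input_ids)

sample = ("Chaque organisation syndicale représentative dans l'entreprise "
          "peut désigner un représentant syndical au comité.")
print(n_tokens(sample))  # rows of ~820 French words land near 1,500 tokens
```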
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
- All French codes (PDF): https://www.legifrance.gouv.fr/liste/code?etatTexte=VIGUEUR&etatTexte=VIGUEUR_DIFF
- Court decisions from JUDILIBRE API: https://piste.gouv.fr/index.php?option=com_apiportal&view=apitester&usage=api&apitab=tests&apiName=JUDILIBRE&apiId=b6d2f389-c3ec-4eb3-9075-bc24d0783781&managerId=2&type=rest&apiVersion=1.0.0&Itemid=265&swaggerVersion=2.0&lang=fr
#### Who are the source language producers?
The data comes directly from the French justice system.
## Additional Information
### Licensing Information
The dataset is available under the Creative Commons Attribution-ShareAlike 4.0 license (CC BY-SA 4.0).
|
open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h | ---
pretty_name: Evaluation run of SC56/Mistral-7B-sumz-dpo-5h
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC56/Mistral-7B-sumz-dpo-5h](https://huggingface.co/SC56/Mistral-7B-sumz-dpo-5h)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T02:31:37.201577](https://huggingface.co/datasets/open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h/blob/main/results_2024-01-28T02-31-37.201577.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536725481310549,\n\
\ \"acc_stderr\": 0.03217318839707677,\n \"acc_norm\": 0.6532311674900567,\n\
\ \"acc_norm_stderr\": 0.03284372303538653,\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.7235747473554947,\n\
\ \"mc2_stderr\": 0.01467203939730831\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.01334091608524626,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7211710814578769,\n\
\ \"acc_stderr\": 0.004475067344626756,\n \"acc_norm\": 0.8898625771758614,\n\
\ \"acc_norm_stderr\": 0.00312421161719886\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"\
acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531,\n \"acc_norm\"\
: 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.030588697013783642,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.030588697013783642\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.01646320023811452,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.01646320023811452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.025218040373410626,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.025218040373410626\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.7235747473554947,\n\
\ \"mc2_stderr\": 0.01467203939730831\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785725\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \
\ \"acc_stderr\": 0.012782681251053198\n }\n}\n```"
repo_url: https://huggingface.co/SC56/Mistral-7B-sumz-dpo-5h
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|arc:challenge|25_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|gsm8k|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hellaswag|10_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-31-37.201577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T02-31-37.201577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- '**/details_harness|winogrande|5_2024-01-28T02-31-37.201577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T02-31-37.201577.parquet'
- config_name: results
data_files:
- split: 2024_01_28T02_31_37.201577
path:
- results_2024-01-28T02-31-37.201577.parquet
- split: latest
path:
- results_2024-01-28T02-31-37.201577.parquet
---
# Dataset Card for Evaluation run of SC56/Mistral-7B-sumz-dpo-5h
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC56/Mistral-7B-sumz-dpo-5h](https://huggingface.co/SC56/Mistral-7B-sumz-dpo-5h) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h",
"harness_winogrande_5",
split="train")
```
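The aggregated metrics can be loaded the same way through the "results" configuration, whose "latest" split always points at the most recent run:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h",
	"results",
	split="latest")
```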
## Latest results
These are the [latest results from run 2024-01-28T02:31:37.201577](https://huggingface.co/datasets/open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h/blob/main/results_2024-01-28T02-31-37.201577.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6536725481310549,
"acc_stderr": 0.03217318839707677,
"acc_norm": 0.6532311674900567,
"acc_norm_stderr": 0.03284372303538653,
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.7235747473554947,
"mc2_stderr": 0.01467203939730831
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.01334091608524626,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7211710814578769,
"acc_stderr": 0.004475067344626756,
"acc_norm": 0.8898625771758614,
"acc_norm_stderr": 0.00312421161719886
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.030588697013783642,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.030588697013783642
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811452,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.025218040373410626,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.025218040373410626
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.7235747473554947,
"mc2_stderr": 0.01467203939730831
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785725
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053198
}
}
```
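If you prefer working with the raw file, the JSON above can also be fetched directly from the repo with `huggingface_hub`. A sketch under two assumptions: the filename is taken from the "Latest results" link above, and, depending on the file version, the metrics may sit under a top-level "results" key:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file linked in "Latest results".
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-5h",
    filename="results_2024-01-28T02-31-37.201577.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The "all" block holds the averaged metrics shown above; fall back to
# the top level if this file does not nest them under "results".
metrics = data.get("results", data)["all"]
print(metrics["acc"], metrics["acc_stderr"])
```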
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
petricevich/hr_laws | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 56896684
num_examples: 865
download_size: 24891008
dataset_size: 56896684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp | ---
pretty_name: Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Radu1999/Mistral-Instruct-Ukrainian-slerp](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T11:11:52.976201](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp/blob/main/results_2024-02-12T11-11-52.976201.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6128348134449864,\n\
\ \"acc_stderr\": 0.03306039267014507,\n \"acc_norm\": 0.6174798445939971,\n\
\ \"acc_norm_stderr\": 0.033726644979784004,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6348683354452056,\n\
\ \"mc2_stderr\": 0.015251462930296836\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.01418211986697487\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n\
\ \"acc_stderr\": 0.004750884401095161,\n \"acc_norm\": 0.8434574785899224,\n\
\ \"acc_norm_stderr\": 0.0036262628054422163\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.025988500792411898,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.025988500792411898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371153,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6348683354452056,\n\
\ \"mc2_stderr\": 0.015251462930296836\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41698256254738436,\n \
\ \"acc_stderr\": 0.013581320997216588\n }\n}\n```"
repo_url: https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|arc:challenge|25_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|gsm8k|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hellaswag|10_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T11-11-52.976201.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- '**/details_harness|winogrande|5_2024-02-12T11-11-52.976201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T11-11-52.976201.parquet'
- config_name: results
data_files:
- split: 2024_02_12T11_11_52.976201
path:
- results_2024-02-12T11-11-52.976201.parquet
- split: latest
path:
- results_2024-02-12T11-11-52.976201.parquet
---
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-slerp](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T11:11:52.976201](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp/blob/main/results_2024-02-12T11-11-52.976201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6128348134449864,
"acc_stderr": 0.03306039267014507,
"acc_norm": 0.6174798445939971,
"acc_norm_stderr": 0.033726644979784004,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6348683354452056,
"mc2_stderr": 0.015251462930296836
},
"harness|arc:challenge|25": {
"acc": 0.5767918088737202,
"acc_stderr": 0.014438036220848029,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.01418211986697487
},
"harness|hellaswag|10": {
"acc": 0.6528579964150567,
"acc_stderr": 0.004750884401095161,
"acc_norm": 0.8434574785899224,
"acc_norm_stderr": 0.0036262628054422163
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411898,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686858,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686858
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371153,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6348683354452056,
"mc2_stderr": 0.015251462930296836
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.41698256254738436,
"acc_stderr": 0.013581320997216588
}
}
```
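To read the aggregated metrics programmatically, a minimal sketch (relying only on the `results` configuration and the `latest` split described above) is:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp",
    "results",
    split="latest",
)
print(results[0])  # inspect the first (and typically only) row
```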
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Thanmay/hellaswag-hi | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
- name: itv2 hi 0
dtype: string
- name: itv2 hi 1
dtype: string
- name: itv2 hi 2
dtype: string
- name: itv2 hi 3
dtype: string
splits:
- name: test
num_bytes: 48075015
num_examples: 10003
- name: validation
num_bytes: 50007155
num_examples: 10042
download_size: 20134375
dataset_size: 98082170
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "hellaswag-hi"
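A minimal loading sketch (split and column names taken from the YAML header above; the `itv2 hi *` columns appear to hold Hindi translations of the four endings):
```python
from datasets import load_dataset

# Load the validation split declared in the YAML header.
ds = load_dataset("Thanmay/hellaswag-hi", split="validation")

example = ds[0]
print(example["ctx"])        # original context
print(example["itv2 hi 0"])  # Hindi version of the first ending
```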
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
elenanereiss/german-ler | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- de
license:
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: dataset-of-legal-documents
pretty_name: German Named Entity Recognition in Legal Documents
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- ner, named entity recognition, legal ner, legal texts, label classification
task_categories:
- token-classification
task_ids:
- named-entity-recognition
train-eval-index:
- config: conll2003
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
col_mapping:
tokens: tokens
ner_tags: tags
---
# Dataset Card for "German LER"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/elenanereiss/Legal-Entity-Recognition](https://github.com/elenanereiss/Legal-Entity-Recognition)
- **Paper:** [https://arxiv.org/pdf/2003.13016v1.pdf](https://arxiv.org/pdf/2003.13016v1.pdf)
- **Point of Contact:** [elena.leitner@dfki.de](elena.leitner@dfki.de)
### Dataset Summary
A dataset of legal documents from German federal court decisions for Named Entity Recognition. The dataset is human-annotated with 19 fine-grained entity classes. It consists of approx. 67,000 sentences and contains approx. 54,000 annotated entities. NER tags use the `BIO` tagging scheme.
The dataset includes two different versions of annotations, one with a set of 19 fine-grained semantic classes (`ner_tags`) and another with a set of 7 coarse-grained classes (`ner_coarse_tags`). There are 53,632 annotated entities in total, the majority of which (74.34 %) are legal entities; the others are persons, locations and organizations (25.66 %).

For more details see [https://arxiv.org/pdf/2003.13016v1.pdf](https://arxiv.org/pdf/2003.13016v1.pdf).
### Supported Tasks and Leaderboards
- **Tasks:** Named Entity Recognition
- **Leaderboards:**
### Languages
German
## Dataset Structure
### Data Instances
```python
{
'id': '1',
'tokens': ['Eine', 'solchermaßen', 'verzögerte', 'oder', 'bewusst', 'eingesetzte', 'Verkettung', 'sachgrundloser', 'Befristungen', 'schließt', '§', '14', 'Abs.', '2', 'Satz', '2', 'TzBfG', 'aus', '.'],
'ner_tags': [38, 38, 38, 38, 38, 38, 38, 38, 38, 38, 3, 22, 22, 22, 22, 22, 22, 38, 38],
'ner_coarse_tags': [14, 14, 14, 14, 14, 14, 14, 14, 14, 14, 2, 9, 9, 9, 9, 9, 9, 14, 14]
}
```
### Data Fields
```python
{
'id': Value(dtype='string', id=None),
'tokens': Sequence(feature=Value(dtype='string', id=None),
length=-1, id=None),
'ner_tags': Sequence(feature=ClassLabel(num_classes=39,
names=['B-AN',
'B-EUN',
'B-GRT',
'B-GS',
'B-INN',
'B-LD',
'B-LDS',
'B-LIT',
'B-MRK',
'B-ORG',
'B-PER',
'B-RR',
'B-RS',
'B-ST',
'B-STR',
'B-UN',
'B-VO',
'B-VS',
'B-VT',
'I-AN',
'I-EUN',
'I-GRT',
'I-GS',
'I-INN',
'I-LD',
'I-LDS',
'I-LIT',
'I-MRK',
'I-ORG',
'I-PER',
'I-RR',
'I-RS',
'I-ST',
'I-STR',
'I-UN',
'I-VO',
'I-VS',
'I-VT',
'O'],
id=None),
length=-1,
id=None),
'ner_coarse_tags': Sequence(feature=ClassLabel(num_classes=15,
names=['B-LIT',
'B-LOC',
'B-NRM',
'B-ORG',
'B-PER',
'B-REG',
'B-RS',
'I-LIT',
'I-LOC',
'I-NRM',
'I-ORG',
'I-PER',
'I-REG',
'I-RS',
'O'],
id=None),
length=-1,
id=None)
}
```
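The integer tags can be decoded back into these label names through the `ClassLabel` features, for example (a minimal sketch, assuming the split names from the table below):
```python
from datasets import load_dataset

ds = load_dataset("elenanereiss/german-ler", split="test")

# The Sequence features wrap ClassLabel objects that map ints to tag names.
fine = ds.features["ner_tags"].feature
coarse = ds.features["ner_coarse_tags"].feature

example = ds[0]
for token, tag, coarse_tag in zip(
    example["tokens"],
    fine.int2str(example["ner_tags"]),
    coarse.int2str(example["ner_coarse_tags"]),
):
    print(token, tag, coarse_tag)
```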
### Data Splits
| | train | validation | test |
|-------------------------|------:|-----------:|-----:|
| Input Sentences | 53384 | 6666 | 6673 |
## Dataset Creation
### Curation Rationale
Documents in the legal domain contain multiple references to named entities, especially domain-specific named entities, i. e., jurisdictions, legal institutions, etc. Legal documents are unique and differ greatly from newspaper texts. On the one hand, the occurrence of general-domain named entities is relatively rare. On the other hand, in concrete applications, crucial domain-specific entities need to be identified in a reliable way, such as designations of legal norms and references to other legal documents (laws, ordinances, regulations, decisions, etc.). Most NER solutions operate in the general or news domain, which makes them inapplicable to the analysis of legal documents. Accordingly, there is a great need for an NER-annotated dataset consisting of legal documents, including the corresponding development of a typology of semantic concepts and uniform annotation guidelines.
### Source Data
Court decisions from 2017 and 2018 were selected for the dataset, published online by the [Federal Ministry of Justice and Consumer Protection](http://www.rechtsprechung-im-internet.de). The documents originate from seven federal courts: Federal Labour Court (BAG), Federal Fiscal Court (BFH), Federal Court of Justice (BGH), Federal Patent Court (BPatG), Federal Social Court (BSG), Federal Constitutional Court (BVerfG) and Federal Administrative Court (BVerwG).
#### Initial Data Collection and Normalization
From the table of [contents](http://www.rechtsprechung-im-internet.de/rii-toc.xml), 107 documents from each court were selected (see Table 1). The data was collected from the XML documents, i. e., it was extracted from the XML elements `Mitwirkung, Titelzeile, Leitsatz, Tenor, Tatbestand, Entscheidungsgründe, Gründen, abweichende Meinung, and sonstiger Titel`. The metadata at the beginning of the documents (name of court, date of decision, file number, European Case Law Identifier, document type, laws) and the metadata that belonged to previous legal proceedings were deleted. Paragraph numbers were removed.
The extracted data was split into sentences, tokenised using [SoMaJo](https://github.com/tsproisl/SoMaJo) and manually annotated in [WebAnno](https://webanno.github.io/webanno/).
#### Who are the source language producers?
The Federal Ministry of Justice and the Federal Office of Justice provide selected decisions. Court decisions were produced by humans.
### Annotations
#### Annotation process
For more details see [annotation guidelines](https://github.com/elenanereiss/Legal-Entity-Recognition/blob/master/docs/Annotationsrichtlinien.pdf) (in German).
<!-- #### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)-->
### Personal and Sensitive Information
A fundamental characteristic of the published decisions is that all personal information has been anonymised for privacy reasons. This affects the classes person, location and organization.
<!-- ## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)-->
### Licensing Information
[CC BY-SA 4.0 license](https://creativecommons.org/licenses/by-sa/4.0/)
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.2003.13016,
doi = {10.48550/ARXIV.2003.13016},
url = {https://arxiv.org/abs/2003.13016},
author = {Leitner, Elena and Rehm, Georg and Moreno-Schneider, Julián},
keywords = {Computation and Language (cs.CL), Information Retrieval (cs.IR), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {A Dataset of German Legal Documents for Named Entity Recognition},
publisher = {arXiv},
year = {2020},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
|
AdapterOcean/biology_dataset_standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 46233919
num_examples: 4054
download_size: 0
dataset_size: 46233919
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roblab/olis | ---
license: wtfpl
---
|
FINNUMBER/FINCH_TRAIN_NQA_COM_400 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1135961
num_examples: 400
download_size: 634970
dataset_size: 1135961
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/georg_thiele_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane)
This is the dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `bangs, red_eyes, long_hair, braid, black_hair, brown_hair, hat, beret, bow, hair_bun, red_bow, single_hair_bun, single_side_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 12.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 7.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 18 | 13.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 10.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 18 | 18.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/georg_thiele_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, full_body, long_sleeves, obi, closed_mouth, simple_background, sitting, standing, white_background, wide_sleeves, barefoot, black_footwear, boots, candy_apple, holding_food, jacket, striped_kimono, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | full_body | long_sleeves | obi | closed_mouth | simple_background | sitting | standing | white_background | wide_sleeves | barefoot | black_footwear | boots | candy_apple | holding_food | jacket | striped_kimono | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:------------|:---------------|:------|:---------------|:--------------------|:----------|:-----------|:-------------------|:---------------|:-----------|:-----------------|:--------|:--------------|:---------------|:---------|:-----------------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
classla/FRENK-hate-hr | ---
language:
- hr
license:
- other
size_categories:
- 1K<n<10K
task_categories:
- text-classification
task_ids: []
tags:
- hate-speech-detection
- offensive-language
---
# Offensive language dataset of Croatian comments FRENK 1.0
Croatian subset of the [FRENK dataset](http://hdl.handle.net/11356/1433). Also available on HuggingFace dataset hub: [English subset](https://huggingface.co/datasets/5roop/FRENK-hate-en), [Slovenian subset](https://huggingface.co/datasets/5roop/FRENK-hate-sl).
## Dataset Description
- **Homepage:** http://hdl.handle.net/11356/1433
- **Repository:** http://hdl.handle.net/11356/1433
- **Paper:** https://arxiv.org/abs/1906.02045
- **Project page** https://nl.ijs.si/frenk/
## Description of the original dataset
>The original FRENK dataset consists of comments to Facebook posts (news articles) of mainstream media outlets from Croatia, Great Britain, and Slovenia, on the topics of migrants and LGBT. The dataset contains whole discussion threads. Each comment is annotated by the type of socially unacceptable discourse (e.g., inappropriate, offensive, violent speech) and its target (e.g., migrants/LGBT, commenters, media). The annotation schema is described in detail in [https://arxiv.org/pdf/1906.02045.pdf]. Usernames in the metadata are pseudo-anonymised and removed from the comments.
>
>The data in each language (Croatian (hr), English (en), Slovenian (sl), and topic (migrants, LGBT) is divided into a training and a testing portion. The training and testing data consist of separate discussion threads, i.e., there is no cross-discussion-thread contamination between training and testing data. The sizes of the splits are the following: Croatian, migrants: 4356 training comments, 978 testing comments; Croatian LGBT: 4494 training comments, 1142 comments; English, migrants: 4540 training comments, 1285 testing comments; English, LGBT: 4819 training comments, 1017 testing comments; Slovenian, migrants: 5145 training comments, 1277 testing comments; Slovenian, LGBT: 2842 training comments, 900 testing comments.
For this dataset only the Croatian data was used. The training segment has been split into the first 90% (published here as the training split) and the last 10% (published here as the dev split). The test segment has been preserved in its original form.
## Usage in `Transformers`
```python
import datasets
ds = datasets.load_dataset("classla/FRENK-hate-hr","binary")
```
For binary classification the following encoding is used:
```python
_CLASS_MAP_BINARY = {
'Acceptable': 0,
'Offensive': 1,
}
```
The original labels are available if the dataset is loaded with the `multiclass` option:
```python
import datasets
ds = datasets.load_dataset("classla/FRENK-hate-hr","multiclass")
```
In this case the encoding used is:
```python
_CLASS_MAP_MULTICLASS = {
'Acceptable speech': 0,
'Inappropriate': 1,
'Background offensive': 2,
'Other offensive': 3,
'Background violence': 4,
'Other violence': 5,
}
```
## Data structure
* `text`: text
* `target`: who is the target of the hate-speech text ("no target", "commenter", "target" (migrants or LGBT, depending on the topic), or "related to" (again, the topic))
* `topic`: whether the text relates to lgbt or migrants hate-speech domains
* `label`: label of the text instance, see above.
## Data instance
```
{'text': 'Potpisujem komentar g ankice pavicic',
'target': 'No target',
'topic': 'lgbt',
'label': 0}
```
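Building on the loading snippets above, a small sketch for inspecting the label distribution per topic:
```python
from collections import Counter

import datasets

ds = datasets.load_dataset("classla/FRENK-hate-hr", "binary")

# Count acceptable (0) vs. offensive (1) comments for each topic.
counts = Counter((row["topic"], row["label"]) for row in ds["train"])
print(counts)
```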
## Licensing information
CLARIN.SI Licence ACA ID-BY-NC-INF-NORED 1.0
## Citation information
When using this dataset please cite the following paper:
```
@misc{ljubešić2019frenk,
title={The FRENK Datasets of Socially Unacceptable Discourse in Slovene and English},
author={Nikola Ljubešić and Darja Fišer and Tomaž Erjavec},
year={2019},
eprint={1906.02045},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/1906.02045}
}
```
The original dataset can be cited as
```
@misc{11356/1433,
title = {Offensive language dataset of Croatian, English and Slovenian comments {FRENK} 1.0},
author = {Ljube{\v s}i{\'c}, Nikola and Fi{\v s}er, Darja and Erjavec, Toma{\v z}},
url = {http://hdl.handle.net/11356/1433},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {{CLARIN}.{SI} Licence {ACA} {ID}-{BY}-{NC}-{INF}-{NORED} 1.0},
year = {2021} }
``` |
mask-distilled-one-sec-cv12/chunk_45 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1209553680
num_examples: 237540
download_size: 1229354036
dataset_size: 1209553680
---
# Dataset Card for "chunk_45"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malhajar/alpaca-gpt4-ar | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: instruction-arabic
dtype: string
- name: input-arabic
dtype: string
- name: output-arabic
dtype: string
- name: text-arabic
dtype: string
splits:
- name: train
num_bytes: 219104037
num_examples: 52000
download_size: 108566377
dataset_size: 219104037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LambdaX-AI/sectionHclausesrobertaEmbeddings | ---
dataset_info:
features:
- name: clause_number
dtype: string
- name: clause_title
dtype: string
- name: clause_text
dtype: string
- name: emb
sequence: float64
splits:
- name: train
num_bytes: 869302
num_examples: 102
download_size: 885535
dataset_size: 869302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sectionHclausesrobertaEmbeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbilcke-hf/ai-tube-neurogorgon | ---
license: cc-by-nc-sa-4.0
pretty_name: Neurogorgon
---
## Description
Gameplay footage of various latent games!
## Model
SVD
## LoRA
veryVANYA/ps1-graphics-sdxl-v2
## Tags
- Gaming
## Voice
Cloée
## Music
Balearic deep house music
## Prompt
A video channel managed by Athena, a famous 28yo gaming influencer.
It generates gameplay video sessions of various unknown, strange or invented videogames (original stories, and not copies of existing franchises).
|
HelgeKn/SemEval_categories | ---
license: apache-2.0
language:
- en
size_categories:
- n<1K
--- |
feedexpdition/FinancialTickets | ---
license: mit
---
|
tollefj/nordic-ner | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 38152132
num_examples: 161379
- name: validation
num_bytes: 10359916
num_examples: 48470
- name: test
num_bytes: 11040741
num_examples: 50498
download_size: 15450325
dataset_size: 59552789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
Created from the four datasets `wikiann`, `norne`, `dane`, and `KBLab/sucx3_ner`.
See detailed config below:
```python
dataset_ids = [
"wikiann",
"dane",
"norne",
"KBLab/sucx3_ner"
]
dataset_subsets = {
"wikiann": [
"nn", "no", "da", "sv", "fo", "is"
],
"dane": [None],
"norne": ["combined-7"],
"KBLab/sucx3_ner": ["original_cased"]
}
```
Unified to the following BIO-scheme:
```
# O: 0
# B-PER: 1
# I-PER: 2
# B-ORG: 3
# I-ORG: 4
# B-LOC: 5
# I-LOC: 6
# B-MISC: 7
# I-MISC: 8
mappers = {
"norne": {
0: 0,
1: 1,
2: 2,
3: 3,
4: 4,
# PROD->MISC
5: 7,
6: 8,
# LOC -> LOC
7: 5,
8: 6,
# DRV -> MISC (names, but derived)
9: 7,
10: 8,
# EVT -> MISC (events)
11: 7,
12: 8,
# MISC -> MISC
13: 7,
14: 8,
},
"KBLab/sucx3_ner": {
"O": 0,
# PER
"B-person": 1,
"I-person": 2,
# LOC
"B-place": 5,
"I-place": 6,
# ORG
"I-inst": 4,
"B-inst": 3,
# MISC
# this is considered a 'work' by someone or something
"B-work": 7,
"I-work": 8,
"B-animal": 7,
"I-animal": 8,
"B-product": 7,
"I-product": 8,
"B-event": 7,
"I-event": 8,
"B-other": 7,
"I-other": 8,
# mythological
"B-myth": 7,
"I-myth": 8,
},
}
```
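A minimal sketch of how such a mapping could be applied with `datasets` (illustrative only, assuming the `dataset_subsets` and `mappers` dicts above; the actual merging script is not part of this card):
```python
from datasets import load_dataset

# Example: remap the norne tags (subset "combined-7" from the config above)
# to the unified BIO scheme.
norne = load_dataset("norne", "combined-7", split="train")
mapper = mappers["norne"]  # the dict defined in the previous block

norne = norne.map(
    lambda ex: {"ner_tags": [mapper[tag] for tag in ex["ner_tags"]]}
)
```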
|
SGBTalha/FelipeEspanhol | ---
license: openrail
---
|
nuvocare/MSD_instruct | ---
dataset_info:
features:
- name: User
dtype: string
- name: Category
dtype: string
- name: Language
dtype:
class_label:
names:
'0': english
'1': french
'2': german
'3': spanish
- name: Topic1
dtype: string
- name: Topic2
dtype: string
- name: Topic3
dtype: string
- name: Text
dtype: string
- name: Question
dtype: string
splits:
- name: train
num_bytes: 133987453
num_examples: 79898
- name: test
num_bytes: 44598046
num_examples: 26639
download_size: 107452363
dataset_size: 178585499
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- de
- es
- fr
- en
tags:
- medical
size_categories:
- 10K<n<100K
---
# MSD_manual_topics_user_base
This dataset has been built from the website https://www.msdmanuals.com/, provided by Merck & Co for the general audience.
The MSD manual is an essential source of knowledge on symptoms, diseases, health and other related topics. The manual makes an extra effort to be accessible to both professionals and patients by offering two distinct versions.
The content, while being labelled the same, differs by the type of user in order to facilitate understanding for patients or give clear details for professionals. The manual is available in different languages.
This dataset focuses on Spanish, German, English and French content about health topics and symptoms. The content is tagged with 2 to 3 medical topics and flagged by user type and language.
It consists of roughly 21M words representing 45M tokens.
This dataset is built for instruction fine-tuning. We built the "Question" by querying a vanilla Mistral 7B model with the following prompt:
```python
You will be asked to create one or several questions in the appropriate language based on three elements. Return the ouptuts in the format of the examples. If asked several, splits the answers with a "&" sign.
Example input:
For question 1 : elements are musculoskeletal and connective tissue disorders, Autoimmune Myositis and Diagnosis of Autoimmune Myositis and language is english
Example output:
["Question 1", "musculoskeletal and connective tissue disorders, Autoimmune Myositis and Diagnosis of Autoimmune Myositis", "English", "How to diagnose a autoimmune Myositis ? "]
Example input:
For question 514 : elements are troubles cardiaques et vasculaires, Bloc auriculoventriculaire and Introduction and language is french
Example output:
["Question 514", "troubles cardiaques et vasculaires, Bloc auriculoventriculaire and Introduction", "French", "Donne moi des informations introductives sur le bloc auriculoventriculaire."]
Example input:
For question 514 : elements are troubles cardiaques et vasculaires, Bloc auriculoventriculaire and Introduction and language is french For question 1 : elements are musculoskeletal and connective tissue disorders, Autoimmune Myositis and Diagnosis of Autoimmune Myositis and language is english
Example output:
["Question 514", "troubles cardiaques et vasculaires, Bloc auriculoventriculaire and Introduction", "French", "Donne moi des informations introductives sur le bloc auriculoventriculaire."] & ["Question 1", "musculoskeletal and connective tissue disorders, Autoimmune Myositis and Diagnosis of Autoimmune Myositis", "English", "How to diagnose a autoimmune Myositis ? "]
[/INST]
```
This dataset can be used to fine-tune a model for the task of supporting patients and clinicians in being better informed in an adapted manner.
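As a small illustration, a sketch for selecting one language and pairing the generated questions with the manual text (column names come from the YAML header; the pairing itself is an assumption, not the official recipe):
```python
from datasets import load_dataset

ds = load_dataset("nuvocare/MSD_instruct", split="train")

# "Language" is a ClassLabel; decode it before filtering.
lang = ds.features["Language"]
english = ds.filter(lambda ex: lang.int2str(ex["Language"]) == "english")

# Hypothetical instruction pair: generated question -> manual text.
pairs = english.map(lambda ex: {"prompt": ex["Question"], "response": ex["Text"]})
print(pairs[0]["prompt"])
```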
An instruct-free version is available here: https://huggingface.co/datasets/nuvocare/MSD_manual_topics_user_base
This dataset is built using the website https://www.msdmanuals.com/, provided by Merck & Co.
All credit for the contents goes to the MSD organization.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harsha28/legal-reasoning-lfqa-merged | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 28997933
num_examples: 15000
- name: validation
num_bytes: 2914456
num_examples: 1500
- name: test
num_bytes: 2912255
num_examples: 1500
download_size: 14621330
dataset_size: 34824644
---
# Dataset Card for "legal-reasoning-lfqa-merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McSpicyWithMilo/target-elements-0.2split-new-delete-180 | ---
dataset_info:
features:
- name: target_element
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 12696.8
num_examples: 144
- name: test
num_bytes: 3174.2
num_examples: 36
download_size: 11436
dataset_size: 15871.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-elements-0.2split-new-delete-180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/databricks_dolly_15k_ru | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 22121608
num_examples: 14914
download_size: 11365356
dataset_size: 22121608
---
# Dataset Card for "databricks_dolly_15k_ru"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MichelBartels/generated-qa-dataset-3 | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
list:
- name: start
dtype: int64
- name: text
dtype: string
- name: id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 2762
num_examples: 3
download_size: 9393
dataset_size: 2762
---
# Dataset Card for "generated-qa-dataset-3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
winglian/chatlogs-en-cleaned | ---
task_categories:
- text-generation
language:
- en
pretty_name: chatlogs cleaned (en)
size_categories:
- 10K<n<100K
--- |
benayas/massive_chatgpt_5pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 790924
num_examples: 11514
download_size: 271700
dataset_size: 790924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adityarra07/train_17000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 2265737853.4844837
num_examples: 17000
- name: test
num_bytes: 26655739.452758636
num_examples: 200
download_size: 2265471038
dataset_size: 2292393592.9372425
---
# Dataset Card for "train_17000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aamirsea/aamir-huggingface | ---
license: llama2
---
|
CyberHarem/ayase_honoka_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ayase_honoka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ayase_honoka (THE iDOLM@STER: Cinderella Girls), containing 139 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, breasts, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 139 | 153.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_honoka_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 139 | 93.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_honoka_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 330 | 200.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_honoka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 139 | 138.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_honoka_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 330 | 272.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayase_honoka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ayase_honoka_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, smile, white_background, simple_background, open_mouth, black_pantyhose, serafuku, skirt |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, solo, large_breasts, necklace, open_mouth, yellow_eyes, blush, collarbone, single_hair_bun, bangs, strapless_dress, :d, hair_flower, medium_breasts, sidelocks |
| 2 | 7 |  |  |  |  |  | 1girl, ponytail, solo, blush, hair_scrunchie, looking_at_viewer, smile, yellow_eyes, bangs, collarbone, open_mouth, blue_shirt, leg_up, leggings, medium_breasts, pantyhose, shorts, simple_background, standing_split, sweat, tied_shirt, armpits, arms_up, short_sleeves, white_background |
| 3 | 10 |  |  |  |  |  | 1girl, cat_ears, solo, neck_bell, paw_gloves, cat_paws, looking_at_viewer, bare_shoulders, blush, cat_tail, elbow_gloves, ribbon, cleavage, fishnets, garter_straps, jingle_bell, open_mouth, pink_bow, smile, thighhighs, collar, dress, halloween |
| 4 | 8 |  |  |  |  |  | 1girl, card_(medium), character_name, gem_(symbol), solo, star_(symbol), open_mouth, smile, hair_flower, blue_background, dress, microphone, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | smile | white_background | simple_background | open_mouth | black_pantyhose | serafuku | skirt | bare_shoulders | cleavage | large_breasts | necklace | yellow_eyes | collarbone | single_hair_bun | bangs | strapless_dress | :d | hair_flower | medium_breasts | sidelocks | ponytail | hair_scrunchie | blue_shirt | leg_up | leggings | pantyhose | shorts | standing_split | sweat | tied_shirt | armpits | arms_up | short_sleeves | cat_ears | neck_bell | paw_gloves | cat_paws | cat_tail | elbow_gloves | ribbon | fishnets | garter_straps | jingle_bell | pink_bow | thighhighs | collar | dress | halloween | card_(medium) | character_name | gem_(symbol) | star_(symbol) | blue_background | microphone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:-------------------|:--------------------|:-------------|:------------------|:-----------|:--------|:-----------------|:-----------|:----------------|:-----------|:--------------|:-------------|:------------------|:--------|:------------------|:-----|:--------------|:-----------------|:------------|:-----------|:-----------------|:-------------|:---------|:-----------|:------------|:---------|:-----------------|:--------|:-------------|:----------|:----------|:----------------|:-----------|:------------|:-------------|:-----------|:-----------|:---------------|:---------|:-----------|:----------------|:--------------|:-----------|:-------------|:---------|:--------|:------------|:----------------|:-----------------|:---------------|:----------------|:------------------|:-------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | X | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X |
|