datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
autoevaluate/autoeval-staging-eval-project-976d13e6-0b05-475e-9b4e-e8fbc174cfae-346 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: autoevaluate/extractive-question-answering
metrics: []
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: autoevaluate/extractive-question-answering
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
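The `col_mapping` in the metadata above ties the evaluator's expected fields to (possibly nested) dataset columns. A minimal sketch of how such a mapping can be resolved against one SQuAD-style record; the record itself is invented for illustration and is not taken from this dataset:

```python
# Illustrative SQuAD-style record; the text and answer values are invented.
record = {
    "context": "AutoTrain evaluates models on Hugging Face datasets.",
    "question": "What does AutoTrain evaluate models on?",
    "answers": {"text": ["Hugging Face datasets"], "answer_start": [30]},
}

# The col_mapping from the card above: keys are the fields the evaluator
# expects, values are (possibly dotted) column paths in the dataset.
col_mapping = {
    "context": "context",
    "question": "question",
    "answers-text": "answers.text",
    "answers-answer_start": "answers.answer_start",
}

def apply_mapping(record, mapping):
    """Resolve each dotted column path and return a flat record."""
    out = {}
    for target, source in mapping.items():
        value = record
        for part in source.split("."):  # walk nested dicts via the dotted path
            value = value[part]
        out[target] = value
    return out

flat = apply_mapping(record, col_mapping)
```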
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
CreitinGameplays/small-chat-assistant-for-bloom | ---
license: mit
---
# Info
This dataset was generated by ChatGPT and is intended for fine-tuning language models. |
TrainingDataPro/MacBook-Attacks-Dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
language:
- en
tags:
- finance
dataset_info:
features:
- name: file
dtype: string
- name: phone
dtype: string
- name: computer
dtype: string
- name: gender
dtype: string
- name: age
dtype: int16
- name: country
dtype: string
splits:
- name: train
num_bytes: 1418
num_examples: 24
download_size: 573934283
dataset_size: 1418
---
# Antispoofing Replay Dataset
The dataset consists of videos of replay attacks played on different models of MacBooks. It addresses anti-spoofing tasks and is useful for business and security systems.
The dataset includes **replay attacks**: videos of real people played on a computer screen and re-filmed on a phone.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=MacBook-Attacks-Dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
The folder "attacks" includes videos of replay attacks.
### Models of MacBooks in the dataset:
- MacBook 13
- MacBook Air
- MacBook Air 7
- MacBook Air 11
- MacBook Air 13
- MacBook Air M1
- MacBook Pro 12
- MacBook Pro 13
### File with the extension .csv
The file includes the following information for each media file:
- **file**: link to access the replay video,
- **phone**: the device used to capture the replay video,
- **computer**: the device used to play the video,
- **gender**: gender of a person in the video,
- **age**: age of the person in the video,
- **country**: country of the person
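
As a sketch, metadata rows with these columns can be parsed with the Python standard library; the sample row below is invented for illustration and does not come from the dataset:

```python
import csv
import io

# One invented sample row using the documented columns; none of these
# values come from the actual dataset.
raw = io.StringIO(
    "file,phone,computer,gender,age,country\n"
    "attacks/video_001.mp4,iPhone 12,MacBook Air M1,female,29,France\n"
)

rows = list(csv.DictReader(raw))
for row in rows:
    row["age"] = int(row["age"])  # age is numeric (int16 per the card)
```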
## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=MacBook-Attacks-Dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
chikino/DEADPOOL3 | ---
license: openrail
---
|
open-llm-leaderboard/details_Fizzarolli__sappha-2b-v3 | ---
pretty_name: Evaluation run of Fizzarolli/sappha-2b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fizzarolli/sappha-2b-v3](https://huggingface.co/Fizzarolli/sappha-2b-v3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fizzarolli__sappha-2b-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T14:34:41.283293](https://huggingface.co/datasets/open-llm-leaderboard/details_Fizzarolli__sappha-2b-v3/blob/main/results_2024-03-24T14-34-41.283293.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38770836928908536,\n\
\ \"acc_stderr\": 0.03401202691311986,\n \"acc_norm\": 0.39301680909612896,\n\
\ \"acc_norm_stderr\": 0.03490947267798798,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3993902530198297,\n\
\ \"mc2_stderr\": 0.014276014222438483\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.447098976109215,\n \"acc_stderr\": 0.014529380160526848,\n\
\ \"acc_norm\": 0.4616040955631399,\n \"acc_norm_stderr\": 0.01456824555029636\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5266879107747461,\n\
\ \"acc_stderr\": 0.004982668452118941,\n \"acc_norm\": 0.707329217287393,\n\
\ \"acc_norm_stderr\": 0.004540586983229991\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.02339382650048487,\n \"acc_norm\"\
: 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048487\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3967741935483871,\n\
\ \"acc_stderr\": 0.027831231605767955,\n \"acc_norm\": 0.3967741935483871,\n\
\ \"acc_norm_stderr\": 0.027831231605767955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4292929292929293,\n\
\ \"acc_stderr\": 0.035265527246011986,\n \"acc_norm\": 0.4292929292929293,\n\
\ \"acc_norm_stderr\": 0.035265527246011986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.5025906735751295,\n \"acc_stderr\": 0.03608390745384488,\n\
\ \"acc_norm\": 0.5025906735751295,\n \"acc_norm_stderr\": 0.03608390745384488\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.02394672474156398,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.02394672474156398\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.48440366972477067,\n \"acc_stderr\": 0.02142689153920805,\n \"\
acc_norm\": 0.48440366972477067,\n \"acc_norm_stderr\": 0.02142689153920805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176851,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176851\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.48945147679324896,\n \"acc_stderr\": 0.032539983791662855,\n \
\ \"acc_norm\": 0.48945147679324896,\n \"acc_norm_stderr\": 0.032539983791662855\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.0418644516301375,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.0418644516301375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5769230769230769,\n\
\ \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.5769230769230769,\n\
\ \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.49680715197956576,\n\
\ \"acc_stderr\": 0.01787959894593307,\n \"acc_norm\": 0.49680715197956576,\n\
\ \"acc_norm_stderr\": 0.01787959894593307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.026483392042098177,\n\
\ \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.026483392042098177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210749,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210749\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3987138263665595,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.3987138263665595,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422697,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32333767926988266,\n\
\ \"acc_stderr\": 0.011946565758447212,\n \"acc_norm\": 0.32333767926988266,\n\
\ \"acc_norm_stderr\": 0.011946565758447212\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3872549019607843,\n \"acc_stderr\": 0.01970687580408563,\n \
\ \"acc_norm\": 0.3872549019607843,\n \"acc_norm_stderr\": 0.01970687580408563\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.40298507462686567,\n\
\ \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.40298507462686567,\n\
\ \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n\
\ \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731613,\n \"mc2\": 0.3993902530198297,\n\
\ \"mc2_stderr\": 0.014276014222438483\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6550907655880032,\n \"acc_stderr\": 0.013359379805033676\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674326\n }\n}\n```"
repo_url: https://huggingface.co/Fizzarolli/sappha-2b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-34-41.283293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-34-41.283293.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- '**/details_harness|winogrande|5_2024-03-24T14-34-41.283293.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T14-34-41.283293.parquet'
- config_name: results
data_files:
- split: 2024_03_24T14_34_41.283293
path:
- results_2024-03-24T14-34-41.283293.parquet
- split: latest
path:
- results_2024-03-24T14-34-41.283293.parquet
---
# Dataset Card for Evaluation run of Fizzarolli/sappha-2b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Fizzarolli/sappha-2b-v3](https://huggingface.co/Fizzarolli/sappha-2b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fizzarolli__sappha-2b-v3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T14:34:41.283293](https://huggingface.co/datasets/open-llm-leaderboard/details_Fizzarolli__sappha-2b-v3/blob/main/results_2024-03-24T14-34-41.283293.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.38770836928908536,
"acc_stderr": 0.03401202691311986,
"acc_norm": 0.39301680909612896,
"acc_norm_stderr": 0.03490947267798798,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731613,
"mc2": 0.3993902530198297,
"mc2_stderr": 0.014276014222438483
},
"harness|arc:challenge|25": {
"acc": 0.447098976109215,
"acc_stderr": 0.014529380160526848,
"acc_norm": 0.4616040955631399,
"acc_norm_stderr": 0.01456824555029636
},
"harness|hellaswag|10": {
"acc": 0.5266879107747461,
"acc_stderr": 0.004982668452118941,
"acc_norm": 0.707329217287393,
"acc_norm_stderr": 0.004540586983229991
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048487,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3967741935483871,
"acc_stderr": 0.027831231605767955,
"acc_norm": 0.3967741935483871,
"acc_norm_stderr": 0.027831231605767955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5025906735751295,
"acc_stderr": 0.03608390745384488,
"acc_norm": 0.5025906735751295,
"acc_norm_stderr": 0.03608390745384488
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.02394672474156398,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.02394672474156398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48440366972477067,
"acc_stderr": 0.02142689153920805,
"acc_norm": 0.48440366972477067,
"acc_norm_stderr": 0.02142689153920805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48945147679324896,
"acc_stderr": 0.032539983791662855,
"acc_norm": 0.48945147679324896,
"acc_norm_stderr": 0.032539983791662855
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.0418644516301375,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.0418644516301375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.49680715197956576,
"acc_stderr": 0.01787959894593307,
"acc_norm": 0.49680715197956576,
"acc_norm_stderr": 0.01787959894593307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210749,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210749
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3987138263665595,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.3987138263665595,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422697,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32333767926988266,
"acc_stderr": 0.011946565758447212,
"acc_norm": 0.32333767926988266,
"acc_norm_stderr": 0.011946565758447212
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3872549019607843,
"acc_stderr": 0.01970687580408563,
"acc_norm": 0.3872549019607843,
"acc_norm_stderr": 0.01970687580408563
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.40298507462686567,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.40298507462686567,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731613,
"mc2": 0.3993902530198297,
"mc2_stderr": 0.014276014222438483
},
"harness|winogrande|5": {
"acc": 0.6550907655880032,
"acc_stderr": 0.013359379805033676
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674326
}
}
```
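As a rough sketch (not part of the official evaluation tooling), the aggregated scores above can be post-processed as a plain Python dict once loaded, for instance to rank the MMLU ("hendrycksTest") subtasks by accuracy. The `results` dict below is a hand-copied subset of the scores shown in this card; the key format follows the lm-evaluation-harness naming visible above.

```python
# Sketch: rank MMLU subtasks from a results dict shaped like the JSON above.
# `results` is a hand-copied subset of the scores shown in this card.
results = {
    "harness|hendrycksTest-computer_security|5": {"acc": 0.56},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.19607843137254902},
    "harness|hendrycksTest-marketing|5": {"acc": 0.5769230769230769},
}

# Keep only the MMLU ("hendrycksTest") subtasks; extract the bare task name
# from keys of the form "harness|hendrycksTest-<task>|5".
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc"]
    for name, scores in results.items()
    if "hendrycksTest" in name
}

# Sort by accuracy, best first.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
best_task, best_acc = ranked[0]
worst_task, worst_acc = ranked[-1]
print(best_task, worst_task)
```

The same pattern applies unchanged to the full 57-subtask dict in the results file.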
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anjunhu/naively_captioned_CUB2002011_test_20shot | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_cupl
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 110186062.0
num_examples: 4000
download_size: 99101657
dataset_size: 110186062.0
---
# Dataset Card for "naively_captioned_CUB2002011_test_20shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gshireesh/mnist_fonts | ---
license: mit
---
|
Amirkid/MedQuad-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 21658852
num_examples: 32800
download_size: 8756796
dataset_size: 21658852
---
# Dataset Card for "MedQuad-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pccl-org/formal-logic-simple-order-token-objects-paired-relationship-0-100 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 1041480
num_examples: 4950
download_size: 114269
dataset_size: 1041480
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.032823 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://assets.genius.com/images/default_avatar_300.png?1631203230')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Krept & Konan, Bugzy Malone, SL, Morisson, Abra Cadabra, RV & Snap Capone</div>
<a href="https://genius.com/artists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone">
<div style="text-align: center; font-size: 14px;">@krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|1| -| -|
The 'train' split can easily be divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/krept-and-konan-bugzy-malone-sl-morisson-abra-cadabra-rv-and-snap-capone")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
MaxYuki/RyotaSakurabaAI | ---
license: apache-2.0
---
|
tr416/dataset_20231006_202109 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73882
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_202109"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaxReynolds/cifar10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
splits:
- name: train
num_bytes: 113648310.0
num_examples: 50000
download_size: 119708256
dataset_size: 113648310.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cifar10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thaweewat/hc3-24k-th | ---
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
tags:
- instruction-finetuning
language:
- th
size_categories:
- 10K<n<100K
---
# Summary
This is a 🇹🇭 Thai instruction dataset translated from [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3) using Google Cloud Translation.
(**24K** examples in total: 17K reddit_eli5, 4K finance, 1.2K medicine, 1.2K open_qa, and 0.8K wiki_csai)
HC3 is the first human-ChatGPT comparison corpus, introduced in this paper:
- [How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection](https://arxiv.org/abs/2301.07597)
Code, models and analysis are available on GitHub:
- GitHub: [Chatgpt-Comparison-Detection project 🔬](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection)
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Thai
Version: 1.0
---
|
cdoswald/SPIDER | ---
license: cc-by-4.0
language:
- en
tags:
- medical
- MRI
- spine
- image segmentation
- computer vision
size_categories:
- n<1K
pretty_name: 'SPIDER: Spine MRI Segmentation'
task_categories:
- image-segmentation
- mask-generation
---
# Spine Segmentation: Discs, Vertebrae and Spinal Canal (SPIDER)
The SPIDER dataset contains (human) lumbar spine magnetic resonance images (MRI) and segmentation masks described in the following paper:
- van der Graaf, J.W., van Hooff, M.L., Buckens, C.F.M. et al. *Lumbar spine segmentation in MR images: a dataset and a public benchmark.*
Sci Data 11, 264 (2024). https://doi.org/10.1038/s41597-024-03090-w
Original data are available on [Zenodo](https://zenodo.org/records/10159290). More information can be found at [SPIDER Grand Challenge](https://spider.grand-challenge.org/).
<figure>
<img src="docs/ex1.png" alt="Example MRI Image" style="height:300px;">
<figcaption>Example MRI scan (at three different depths)</figcaption>
</figure>
<figure>
<img src="docs/ex2.png" alt="Example MRI Image with Segmentation Mask" style="height:300px;">
<figcaption>Example MRI scan with segmentation masks</figcaption>
</figure>
# Dataset Description
- **Published Paper:** [Lumbar spine segmentation in MR images: a dataset and a public benchmark](https://www.nature.com/articles/s41597-024-03090-w)
- **ArXiv Link:** https://arxiv.org/abs/2306.12217
- **Repository:** [Zenodo](https://zenodo.org/records/8009680)
- **Grand Challenge:** [SPIDER Grand Challenge](https://spider.grand-challenge.org/)
# Tutorials
In addition to the information in this README, two detailed tutorials are provided in the [tutorials](tutorials) folder:
1. [Loading the Dataset](tutorials/load_data.ipynb)
2. [Applying the U-Net Image Segmentation Model to SPIDER](tutorials/UNet_SPIDER.ipynb)
<br>
# Table of Contents (TOC)
1. [Getting Started](https://huggingface.co/datasets/cdoswald/SPIDER#getting-started)
2. [Dataset Summary](https://huggingface.co/datasets/cdoswald/SPIDER#dataset-summary)
3. [Data Modifications](https://huggingface.co/datasets/cdoswald/SPIDER#data-modifications)
4. [Dataset Structure](https://huggingface.co/datasets/cdoswald/SPIDER#dataset-structure)
- [Data Instances](https://huggingface.co/datasets/cdoswald/SPIDER#data-instances)
- [Data Schema](https://huggingface.co/datasets/cdoswald/SPIDER#data-schema)
- [Data Splits](https://huggingface.co/datasets/cdoswald/SPIDER#data-splits)
5. [Image Resolution](https://huggingface.co/datasets/cdoswald/SPIDER#image-resolution)
6. [Additional Information](https://huggingface.co/datasets/cdoswald/SPIDER#additional-information)
- [License](https://huggingface.co/datasets/cdoswald/SPIDER#license)
- [Citation](https://huggingface.co/datasets/cdoswald/SPIDER#citation)
- [Disclaimer](https://huggingface.co/datasets/cdoswald/SPIDER#disclaimer)
- [Known Issues/Bugs](https://huggingface.co/datasets/cdoswald/SPIDER#known-issuesbugs)
<br>
# Getting Started
First, you will need to install the following dependencies:
* `datasets >= 2.18.0`
* `scikit-image >= 0.19.3`
* `SimpleITK >= 2.3.1`
Then you can load the SPIDER dataset as follows:
```python
from datasets import load_dataset
dataset = load_dataset("cdoswald/SPIDER", name="default", trust_remote_code=True)
```
See the [Loading the Dataset](tutorials/load_data.ipynb) tutorial for more information.
# Dataset Summary
The dataset includes 447 sagittal T1 and T2 MRI series collected from 218 patients across four hospitals.
Segmentation masks indicating the vertebrae, intervertebral discs (IVDs), and spinal canal are also included.
Segmentation masks were created manually by a medical trainee under the supervision of a medical imaging expert and an experienced musculoskeletal radiologist.
In addition to MR images and segmentation masks, additional metadata (e.g., scanner manufacturer, pixel bandwidth, etc.), limited
patient characteristics (biological sex and age, when available), and radiological gradings indicating specific degenerative
changes can be loaded with the corresponding image data.
# Data Modifications
This version of the SPIDER dataset (i.e., available through the HuggingFace `datasets` library) differs from the original
data available on [Zenodo](https://zenodo.org/records/8009680) in two key ways:
1. Image Rescaling/Resizing: The original 3D volumetric MRI data (images and masks) are stored as .mha files and do not have a standardized height, width, depth, and image resolution.
To enable the data to be loaded through the HuggingFace `datasets` library, all 447 MRI series and masks are standardized to have size `(512, 512, 30)` and intensity range `[0, 255]` (unsigned 8-bit integers); therefore,
n-dimensional interpolation is used to resize and/or rescale the images (via the `skimage.transform.resize` and `skimage.img_as_ubyte` functions).
If you need a different standardization, you have two options:
    i. Pass your preferred standardization size as a `Tuple[int, int, int]` to the `resize_shape` argument in `load_dataset` (see the [Loading the Dataset tutorial](tutorials/load_data.ipynb)); OR
ii. After loading the dataset from HuggingFace, use the `SimpleITK` library to import each image using the file path of the locally cached .mha file.
    The local cache file path is provided for each example when iterating over the dataset (again, see the [Loading the Dataset tutorial](tutorials/load_data.ipynb)).
2. Train, Validation, and Test Set: The original dataset contained 257 unique studies (i.e., patients) that were partitioned into 218 (85%) studies for the public training/validation set
and 39 (15%) studies for the SPIDER Grand Challenge [hidden test set](https://spider.grand-challenge.org/data/). To enable users to train, validate, and test their models prior to submitting
them to the SPIDER Grand Challenge, the original 218 studies that comprised the public training/validation set were further partitioned using a 60%/20%/20% split. The original split
for each study (i.e., training or validation set) is recorded in the `OrigSubset` variable in the study's linked metadata.
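The resizing and rescaling standardization described in point 1 can be sketched roughly as follows. This is a minimal example on synthetic data; the function name and the exact normalization step are assumptions for illustration, not the loader's actual code, though it uses the same `skimage.transform.resize` and `skimage.img_as_ubyte` functions named above:

```python
import numpy as np
from skimage import img_as_ubyte
from skimage.transform import resize

def standardize_volume(volume: np.ndarray, shape=(512, 512, 30)) -> np.ndarray:
    """Resize a 3D volume and rescale its intensities to uint8 in [0, 255]."""
    resized = resize(volume, shape)  # n-dimensional interpolation
    lo, hi = resized.min(), resized.max()
    # Normalize to [0, 1] so img_as_ubyte can map values onto [0, 255]
    normalized = (resized - lo) / (hi - lo) if hi > lo else np.zeros_like(resized)
    return img_as_ubyte(normalized)

volume = np.random.rand(320, 320, 15)  # stand-in for one MRI series
out = standardize_volume(volume)
print(out.shape, out.dtype)  # (512, 512, 30) uint8
```

Passing a different `shape` mirrors what the `resize_shape` argument does when loading the dataset.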
# Dataset Structure
### Data Instances
There are 447 images and corresponding segmentation masks for 218 unique patients.
### Data Schema
The format for each generated data instance is as follows:
1. **patient_id**: a unique ID number indicating the specific patient (note that many patients have more than one scan in the data)
2. **scan_type**: an indicator for whether the image is a T1-weighted, T2-weighted, or T2-SPACE MRI
3. **image**: a 3-dimensional volumetric array (height, width, depth) of values indicating pixel intensities of MRI scan
4. **mask**: a 3-dimensional volumetric array (height, width, depth) of values indicating the following segmented anatomical feature(s):
- 0 = background
- 1-25 = vertebrae (numbered from the bottom, i.e., L5 = 1)
- 100 = spinal canal
- 101-125 = partially visible vertebrae
- 201-225 = intervertebral discs (numbered from the bottom, i.e., L5/S1 = 201)
See the [SPIDER Grand Challenge](https://grand-challenge.org/algorithms/spider-baseline-iis/) documentation for more details.
5. **image_path**: path to the local cache containing the original (non-rescaled and non-resized) MRI image
6. **mask_path**: path to the local cache containing the original (non-rescaled and non-resized) segmentation mask
7. **metadata**: a dictionary of metadata describing image, patient, and scanner characteristics:
    - number of vertebrae
    - number of discs
    - biological sex
    - age
    - manufacturer
    - manufacturer model name
    - serial number
    - software version
    - echo numbers
    - echo time
    - echo train length
    - flip angle
    - imaged nucleus
    - imaging frequency
    - inplane phase encoding direction
    - MR acquisition type
    - magnetic field strength
    - number of phase encoding steps
    - percent phase field of view
    - percent sampling
    - photometric interpretation
    - pixel bandwidth
    - pixel spacing
    - repetition time
    - specific absorption rate (SAR)
    - samples per pixel
    - scanning sequence
    - sequence name
    - series description
    - slice thickness
    - spacing between slices
    - specific character set
    - transmit coil name
    - window center
    - window width
8. **rad_gradings**: radiological gradings by an expert musculoskeletal radiologist indicating specific degenerative
changes at all intervertebral disc (IVD) levels (see page 3 of the [original paper](https://www.nature.com/articles/s41597-024-03090-w)
for more details). The data are provided as a dictionary of lists; an element's position in the list indicates the IVD level. Some elements
are ratings while others are binary indicators. For consistency, each list will have 10 elements, but some IVD levels may not be applicable
to every image (which will be indicated with an empty string).
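The mask integer scheme described above lends itself to a small decoding helper. This is a sketch; the category strings it returns are illustrative labels, not values that appear in the dataset itself:

```python
def decode_mask_label(value: int) -> str:
    """Map a SPIDER segmentation-mask integer to its anatomical category."""
    if value == 0:
        return "background"
    if 1 <= value <= 25:
        return f"vertebra {value}"  # numbered from the bottom (L5 = 1)
    if value == 100:
        return "spinal canal"
    if 101 <= value <= 125:
        return f"partial vertebra {value - 100}"  # partially visible vertebrae
    if 201 <= value <= 225:
        return f"intervertebral disc {value - 200}"  # L5/S1 = 201
    return "undefined"

print(decode_mask_label(1), "|", decode_mask_label(201))  # vertebra 1 | intervertebral disc 1
```

Applied element-wise (e.g. via `numpy.vectorize`), the same mapping turns a whole mask volume into categorical labels.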
### Data Splits
The dataset is split as follows:
- Training set:
- 149 unique patients
- 304 total images
- Sagittal T1: 133 images
- Sagittal T2: 145 images
- Sagittal T2-SPACE: 26 images
- Validation set:
- 37 unique patients
- 75 total images
- Sagittal T1: 34 images
- Sagittal T2: 34 images
- Sagittal T2-SPACE: 7 images
- Test set:
- 32 unique patients
- 68 total images
- Sagittal T1: 29 images
- Sagittal T2: 31 images
- Sagittal T2-SPACE: 8 images
An additional hidden test set provided by the paper authors
(i.e., not available via HuggingFace) is available on the
[SPIDER Grand Challenge](https://spider.grand-challenge.org/spiders-challenge/).
# Image Resolution
> Standard sagittal T1 and T2 image resolution ranges from 3.3 x 0.33 x 0.33 mm to 4.8 x 0.90 x 0.90 mm.
> Sagittal T2 SPACE sequence images had a near isotropic spatial resolution with a voxel size of 0.90 x 0.47 x 0.47 mm.
> (https://spider.grand-challenge.org/data/)
Note that all images are rescaled to have pixel intensities in the range `[0, 255]` (i.e., unsigned 8-bit integers)
for compatibility with the HuggingFace `datasets` library. If you want to use the original resolution, you can
load the original images from the local cache indicated in each example's `image_path` and `mask_path` features.
See the [tutorial](tutorials/load_data.ipynb) for more information.
# Additional Information
### License
The dataset is published under a CC-BY 4.0 license: https://creativecommons.org/licenses/by/4.0/legalcode.
### Citation
- van der Graaf, J.W., van Hooff, M.L., Buckens, C.F.M. et al. Lumbar spine segmentation in MR images: a dataset and a public benchmark. Sci Data 11, 264 (2024). https://doi.org/10.1038/s41597-024-03090-w.
### Disclaimer
I am not affiliated in any way with the aforementioned paper, researchers, or organizations. Please validate any findings using this curated dataset
against the original data provided by the researchers on [Zenodo](https://zenodo.org/records/10159290).
### Known Issues/Bugs
1. Serializing data into Apache Arrow format is required to make the dataset available via HuggingFace's `datasets` library. However, it introduces some segmentation
mask integer values that do not map exactly to a defined [anatomical feature category](https://grand-challenge.org/algorithms/spider-baseline-iis/).
See the data loading [tutorial](tutorials/load_data.ipynb) for more information and temporary work-arounds. |
autoevaluate/autoeval-eval-futin__feed-top_en-246167-2175069946 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: futin/feed
dataset_config: top_en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: futin/feed
* Config: top_en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
zolak/twitter_dataset_1713012404 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2688534
num_examples: 6537
download_size: 1350245
dataset_size: 2688534
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TUKE-DeutscheTelekom/squad-sk | ---
annotations_creators:
- crowdsourced
language:
- sk
language_creators:
- crowdsourced
- found
license:
- cc-by-sa-4.0
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: squad-sk
pretty_name: squad-sk
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- wikipedia
task_categories:
- question-answering
- text-retrieval
task_ids:
- open-domain-qa
- extractive-qa
- document-retrieval
train-eval-index:
- col_mapping:
answers:
answer_start: answer_start
text: text
context: context
question: question
config: squad_v2
metrics:
- name: SQuAD v2
type: squad_v2
splits:
eval_split: validation
train_split: train
task: question-answering
task_id: extractive_question_answering
---
# Dataset Card for squad-sk
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
diltdicker/romance_books_32K | ---
license: openrail
---
Dataset Summary
---
A collection of romance novels featuring `title`, `description`, and `genres` fields. Created with the intention of building a "Romance Novel Generator."
Data Fields
---
- `id` : unique integer identifying the book in the dataset
- `pub_month` : string indicating the month the book was published, in the form `YEAR_MONTH`
- `title` : title of the book
- `author` : comma-separated name (`last-name, first-name`) of the book's author
- `isbn13` : 13-digit ISBN of the book (note that not all books have an ISBN)
- `description` : text description of the book. May contain quoted lines, a brief teaser of the plot, etc...
- `genres` : dictionary of all genres, with `0` indicating the book is **NOT** tagged with that genre and `1` indicating that it is
- additional fields are all the individual genres exploded into columns with their respective 1/0 values
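As a sketch of how the `genres` field can be used, here is a hypothetical record following the schema above (the genre names and values are illustrative, not drawn from the dataset):

```python
# Hypothetical record following the schema above; values are illustrative.
book = {
    "id": 1,
    "title": "Example Title",
    "genres": {"contemporary": 1, "historical": 0, "paranormal": 1},
}

# Collect the genres the book is actually tagged with (flag == 1)
tagged = [genre for genre, flag in book["genres"].items() if flag == 1]
print(tagged)  # ['contemporary', 'paranormal']
```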
Languages
--
- en |
open-llm-leaderboard/details_KatyTheCutie__LemonadeRP-4.5.3 | ---
pretty_name: Evaluation run of KatyTheCutie/LemonadeRP-4.5.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KatyTheCutie__LemonadeRP-4.5.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T04:45:15.527508](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__LemonadeRP-4.5.3/blob/main/results_2024-03-11T04-45-15.527508.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6429443683338827,\n\
\ \"acc_stderr\": 0.032303252195821835,\n \"acc_norm\": 0.6474291071197706,\n\
\ \"acc_norm_stderr\": 0.03294373721786445,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5786643978830561,\n\
\ \"mc2_stderr\": 0.015355347002708696\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000326,\n\
\ \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.013928933461382501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n\
\ \"acc_stderr\": 0.0047349726682996175,\n \"acc_norm\": 0.8472415853415655,\n\
\ \"acc_norm_stderr\": 0.003590192371969654\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n\
\ \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n\
\ \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371805,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.015937484656687026,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.015937484656687026\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.01274823839736555,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.01274823839736555\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070803,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070803\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5786643978830561,\n\
\ \"mc2_stderr\": 0.015355347002708696\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712664\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4632297194844579,\n \
\ \"acc_stderr\": 0.013735191956468648\n }\n}\n```"
repo_url: https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-45-15.527508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-45-15.527508.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- '**/details_harness|winogrande|5_2024-03-11T04-45-15.527508.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T04-45-15.527508.parquet'
- config_name: results
data_files:
- split: 2024_03_11T04_45_15.527508
path:
- results_2024-03-11T04-45-15.527508.parquet
- split: latest
path:
- results_2024-03-11T04-45-15.527508.parquet
---
# Dataset Card for Evaluation run of KatyTheCutie/LemonadeRP-4.5.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KatyTheCutie__LemonadeRP-4.5.3",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-11T04:45:15.527508](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__LemonadeRP-4.5.3/blob/main/results_2024-03-11T04-45-15.527508.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results of each task in its "latest" split):
```json
{
"all": {
"acc": 0.6429443683338827,
"acc_stderr": 0.032303252195821835,
"acc_norm": 0.6474291071197706,
"acc_norm_stderr": 0.03294373721786445,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5786643978830561,
"mc2_stderr": 0.015355347002708696
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000326,
"acc_norm": 0.6510238907849829,
"acc_norm_stderr": 0.013928933461382501
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.0047349726682996175,
"acc_norm": 0.8472415853415655,
"acc_norm_stderr": 0.003590192371969654
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371805,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687026,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687026
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.01274823839736555,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.01274823839736555
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070803,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070803
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5786643978830561,
"mc2_stderr": 0.015355347002708696
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712664
},
"harness|gsm8k|5": {
"acc": 0.4632297194844579,
"acc_stderr": 0.013735191956468648
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhaizy/test | ---
dataset_info:
features:
- name: original_text
dtype: string
- name: rewrite_prompt
dtype: string
- name: rewritten_text
dtype: string
splits:
- name: train
num_bytes: 2902
num_examples: 5
download_size: 6358
dataset_size: 2902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LahiruLowe/t0_explanation_targets_mosaicml-mpt-7b-8k-instruct | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 116123
num_examples: 77
download_size: 51066
dataset_size: 116123
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0_explanation_targets_mosaicml-mpt-7b-8k-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GeraValTec/DroughtPrediction_Whole | ---
license: apache-2.0
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: uint8
- name: feat_static_cat
dtype: uint64
- name: feat_dynamic_real
dtype: float32
- name: item_id
dtype: string
splits:
- name: train
num_bytes: 25227475.333333332
num_examples: 394
- name: test
num_bytes: 25227475.333333332
num_examples: 394
- name: validation
num_bytes: 25227475.333333332
num_examples: 394
download_size: 3603752
dataset_size: 75682426.0
---
|
HydraLM/partitioned_v2_split | ---
configs:
- config_name: default
data_files:
- split: '0'
path: data/0-*
- split: '1'
path: data/1-*
- split: '2'
path: data/2-*
- split: '3'
path: data/3-*
- split: '4'
path: data/4-*
- split: '5'
path: data/5-*
- split: '6'
path: data/6-*
- split: '7'
path: data/7-*
- split: '8'
path: data/8-*
- split: '9'
path: data/9-*
- split: '10'
path: data/10-*
- split: '11'
path: data/11-*
- split: '12'
path: data/12-*
- split: '13'
path: data/13-*
- split: '14'
path: data/14-*
- split: '15'
path: data/15-*
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: cluster_text
dtype: string
- name: embedding
sequence: float64
- name: cluster
dtype: int64
- name: unique_id
dtype: string
splits:
- name: '0'
num_bytes: 779602139
num_examples: 57463
- name: '1'
num_bytes: 716142691
num_examples: 47816
- name: '2'
num_bytes: 376723531
num_examples: 43276
- name: '3'
num_bytes: 271125675
num_examples: 37872
- name: '4'
num_bytes: 334527340
num_examples: 42303
- name: '5'
num_bytes: 428843979
num_examples: 44084
- name: '6'
num_bytes: 285189781
num_examples: 39017
- name: '7'
num_bytes: 350378889
num_examples: 30775
- name: '8'
num_bytes: 261834062
num_examples: 33594
- name: '9'
num_bytes: 165750034
num_examples: 19440
- name: '10'
num_bytes: 137592285
num_examples: 11770
- name: '11'
num_bytes: 688937855
num_examples: 69955
- name: '12'
num_bytes: 239948606
num_examples: 22717
- name: '13'
num_bytes: 377427901
num_examples: 50626
- name: '14'
num_bytes: 343568172
num_examples: 41822
- name: '15'
num_bytes: 711665879
num_examples: 79575
download_size: 4399745966
dataset_size: 6469258819
---
# Dataset Card for "partitioned_v2_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zefang-liu/secqa | ---
license: cc-by-nc-sa-4.0
task_categories:
- multiple-choice
language:
- en
tags:
- security
size_categories:
- n<1K
configs:
- config_name: secqa_v1
data_files:
- split: dev
path: "data/secqa_v1_dev.csv"
- split: val
path: "data/secqa_v1_val.csv"
- split: test
path: "data/secqa_v1_test.csv"
- config_name: secqa_v2
data_files:
- split: dev
path: "data/secqa_v2_dev.csv"
- split: val
path: "data/secqa_v2_val.csv"
- split: test
path: "data/secqa_v2_test.csv"
---
# SecQA
<!-- Provide a quick summary of the dataset. -->
SecQA is a specialized dataset created for the evaluation of Large Language Models (LLMs) in the domain of computer security.
It consists of multiple-choice questions, generated using GPT-4 and the
[Computer Systems Security: Planning for Success](https://web.njit.edu/~rt494/security/) textbook,
aimed at assessing the understanding and application of LLMs' knowledge in computer security.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
SecQA is an innovative dataset designed to benchmark the performance of Large Language Models (LLMs) in the field of computer security.
It contains a series of multiple-choice questions generated by GPT-4, based on the content from the textbook
[Computer Systems Security: Planning for Success](https://web.njit.edu/~rt494/security/).
The dataset is structured into two versions, v1 and v2, with v2 presenting a higher level of difficulty.
This design allows for a preliminary evaluation of LLMs across different levels of complexity
in understanding and applying computer security principles.
The dataset aims to provide a unique resource for researchers and developers to gauge the capabilities of LLMs
in this domain, which is critical to modern digital infrastructure.
- **Curated by:** [Zefang Liu](https://www.linkedin.com/in/zefang-liu/)
- **Language(s) (NLP):** English
- **License:** [CC BY-NC-SA 4.0 DEED](https://creativecommons.org/licenses/by-nc-sa/4.0/)
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [SecQA](https://huggingface.co/datasets/zefang-liu/secqa)
- **Book:** [Computer Systems Security: Planning for Success](https://web.njit.edu/~rt494/security/)
- **Paper:** [SecQA: A Concise Question-Answering Dataset for Evaluating Large Language Models in Computer Security](https://arxiv.org/abs/2312.15838)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The primary application of SecQA is to serve as a benchmark for testing and evaluating
the capabilities of LLMs in the domain of computer security.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
The SecQA dataset is primarily intended for evaluating and benchmarking the performance of Large Language Models (LLMs)
in understanding and applying principles of computer security.
It's suitable for academic research, development of AI in cybersecurity education,
and testing the ability of models to interpret and respond to security-related scenarios.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
SecQA is not designed for and should not be used as a sole resource for real-world cybersecurity decision-making or incident response.
Its use is also inappropriate for training models for unethical purposes, such as hacking or creating security exploits.
Additionally, the dataset should not be considered comprehensive for all aspects of computer security,
and thus, it's not suitable for scenarios requiring broad or up-to-date industry knowledge.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
SecQA is structured into two versions, v1 and v2. Version 1 (v1) serves as the foundational level,
while version 2 (v2) presents a more advanced challenge, catering to a higher degree of difficulty in the questions posed.
Each version is composed of multiple-choice questions that are closely aligned with different learning objectives
within the field of computer security.
Each question in the dataset offers four answer choices, with only one being the correct answer.
To ensure fairness and eliminate any bias in question design, the answer choices have been carefully shuffled.
This shuffling not only contributes to a balanced distribution of answers
but also enhances the dataset’s effectiveness in evaluating the nuanced understanding and reasoning capabilities
of Large Language Models in computer security scenarios.
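As a sketch of how such a benchmark is typically consumed, the snippet below scores a model's predicted choice letters against the answer keys. This is an illustrative helper, not part of the dataset's tooling, and the letter-based answer encoding is an assumption for demonstration purposes.

```python
# Illustrative scorer for SecQA-style multiple-choice items (A/B/C/D).
# Hypothetical helper: the letter encoding of answers is an assumption,
# not the dataset's documented schema.
def accuracy(predictions, references):
    """Fraction of items where the predicted choice letter matches the key,
    ignoring case and surrounding whitespace."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must have the same length")
    correct = sum(
        p.strip().upper() == r.strip().upper()
        for p, r in zip(predictions, references)
    )
    return correct / len(references)

print(accuracy(["A", "c", "B", "D"], ["A", "C", "D", "D"]))  # 0.75
```

Because the answer choices are shuffled, a model guessing uniformly at random would score about 0.25 under this metric.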
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The dataset was created to fill a gap in assessing the understanding and application of computer security concepts by LLMs.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The questions were generated by GPT-4, leveraging content from the textbook "Computer Systems Security: Planning for Success"
under the guidance of researchers.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
The source data is produced by a collaboration between GPT-4 and researchers, utilizing the aforementioned textbook.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The SecQA dataset, though valuable for evaluating LLMs in computer security,
has limitations due to potential content biases from its source material and GPT-4 processing,
a narrow focus on computer security that may not extend to broader cybersecurity contexts,
and varying levels of difficulty across versions that could affect model assessment fairness.
Additionally, the shuffling of answer choices, while promoting balance, might introduce patterns exploitable by sophisticated models.
Given the rapid evolution of the field, some aspects of the dataset may quickly become outdated,
and there is a risk of misuse for purposes like security manipulation.
These factors should be carefully considered in research and application contexts.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@misc{liu2023secqa,
title={SecQA: A Concise Question-Answering Dataset for Evaluating Large Language Models in Computer Security},
author={Zefang Liu},
year={2023},
eprint={2312.15838},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
**APA:**
Zefang Liu. (2023). SecQA: A Concise Question-Answering Dataset for Evaluating Large Language Models in Computer Security.
## Dataset Card Contact
For inquiries or further information about the SecQA dataset,
please contact [Zefang Liu](https://www.linkedin.com/in/zefang-liu/). |
wbxlala/eegimage2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 314788629.2
num_examples: 7360
- name: test
num_bytes: 98370684.0
num_examples: 2300
download_size: 414779791
dataset_size: 413159313.2
---
# Dataset Card for "eegimage2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Erynan/100_deon_util_shuffled | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 73066
num_examples: 100
download_size: 17853
dataset_size: 73066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Blablablab/SOCKET | ---
license: cc-by-4.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/minjechoi/SOCKET
- **Paper:** [Do LLMs Understand Social Knowledge? Evaluating the Sociability of Large Language Models with SocKET Benchmark](https://arxiv.org/abs/2305.14938)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains the tasks used in the paper ["Do LLMs Understand Social Knowledge? Evaluating the Sociability of Large Language Models with SocKET Benchmark"](https://arxiv.org/abs/2305.14938).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
This benchmark is created by aggregating several existing NLP datasets that measure different aspects of social information.
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
@misc{choi2023llms,
title={Do LLMs Understand Social Knowledge? Evaluating the Sociability of Large Language Models with SocKET Benchmark},
author={Minje Choi and Jiaxin Pei and Sagar Kumar and Chang Shu and David Jurgens},
year={2023},
eprint={2305.14938},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
### Contributions
[More Information Needed] |
anan-2024/twitter_dataset_1713178758 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 16335
num_examples: 37
download_size: 9918
dataset_size: 16335
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_automl_covertype_gosdt_l512_d3_sd3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2014669554
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_covertype_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnkushJindal28/ncbi-disease | ---
license: apache-2.0
---
|
llama-anon/petratest | ---
license: agpl-3.0
---
|
zolak/twitter_dataset_50_1713178785 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 300250
num_examples: 755
download_size: 154346
dataset_size: 300250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
parler-tts/mls_eng_10k | ---
pretty_name: 10K hours of English MLS
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: multilingual-librispeech
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
dataset_info:
features:
- name: audio
dtype: audio
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: transcript
dtype: string
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
splits:
- name: dev
num_bytes: 249691299.74
num_examples: 3807
- name: test
num_bytes: 245941162.096
num_examples: 3769
- name: train
num_bytes: 158437701688.205
num_examples: 2420047
download_size: 158461062068
dataset_size: 158933334150.041
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
### Dataset Summary
This is a **10K-hour** subset of the **[English version of the Multilingual LibriSpeech (MLS) dataset](https://huggingface.co/datasets/parler-tts/mls_eng)**.
The data archives were restructured from the original ones from [OpenSLR](http://www.openslr.org/94) to make it easier to stream.
The MLS dataset is a large multilingual corpus suitable for speech research. The dataset is derived from read audiobooks from LibriVox and covers
8 languages - English, German, Dutch, Spanish, French, Italian, Portuguese, and Polish. It includes about 44.5K hours of English and a total of about 6K hours for the other languages.
This dataset card covers the 10K hours of English; refer to this [dataset card](https://huggingface.co/datasets/facebook/multilingual_librispeech) for the other languages.
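Each row carries `begin_time`, `end_time`, and `audio_duration` fields (see the schema above). As a minimal sanity check when filtering clips by length, the segment boundaries should imply the stored duration; the helper below is an illustrative sketch with made-up example values, not part of the dataset's tooling.

```python
# Illustrative sketch: duration in seconds implied by a row's segment
# boundaries. The example row values are made up for demonstration.
def clip_duration(row):
    """Seconds between a segment's begin_time and end_time."""
    return round(row["end_time"] - row["begin_time"], 3)

row = {"begin_time": 12.5, "end_time": 19.25}
print(clip_duration(row))  # 6.75
```

Rows whose stored `audio_duration` differs materially from this value would warrant inspection before training.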
### Licensing Information
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
### Citation Information
```
@article{Pratap2020MLSAL,
title={MLS: A Large-Scale Multilingual Dataset for Speech Research},
author={Vineel Pratap and Qiantong Xu and Anuroop Sriram and Gabriel Synnaeve and Ronan Collobert},
journal={ArXiv},
year={2020},
volume={abs/2012.03411}
}
``` |
ylacombe/libritts_r_tags_tagged_10k | ---
dataset_info:
- config_name: clean
features:
- name: text
dtype: string
- name: text_original
dtype: string
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: chapter_id
dtype: string
- name: id
dtype: string
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: snr
dtype: float32
- name: c50
dtype: float32
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
splits:
- name: clean
num_bytes: 3395048
num_examples: 4837
- name: '100'
num_bytes: 22451705
num_examples: 33232
- name: '360'
num_bytes: 79753887
num_examples: 116426
download_size: 39182876
dataset_size: 105600640
- config_name: other
features:
- name: text
dtype: string
- name: text_original
dtype: string
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: chapter_id
dtype: string
- name: id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
splits:
- name: other
num_bytes: 3218840
num_examples: 5120
- name: '500'
num_bytes: 136464068
num_examples: 205035
download_size: 50544631
dataset_size: 139682908
configs:
- config_name: clean
data_files:
- split: clean
path: clean/clean-*
- split: '100'
path: clean/100-*
- split: '360'
path: clean/360-*
- config_name: other
data_files:
- split: other
path: other/other-*
- split: '500'
path: other/500-*
---
|
ahmadSiddiqi/amazon_massive_intent_fr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 839181
num_examples: 11514
- name: validation
num_bytes: 146928
num_examples: 2033
download_size: 380377
dataset_size: 986109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
UCLA-AGI/SPIN_iter3 | ---
license: apache-2.0
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 215987768
num_examples: 49792
- name: test
num_bytes: 2164394
num_examples: 500
download_size: 120703241
dataset_size: 218152162
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_vicgalle__ConfigurableBeagle-11B | ---
pretty_name: Evaluation run of vicgalle/ConfigurableBeagle-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vicgalle/ConfigurableBeagle-11B](https://huggingface.co/vicgalle/ConfigurableBeagle-11B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__ConfigurableBeagle-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T20:07:53.790814](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__ConfigurableBeagle-11B/blob/main/results_2024-02-17T20-07-53.790814.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6708789898574004,\n\
\ \"acc_stderr\": 0.03158856216976448,\n \"acc_norm\": 0.6718653307631027,\n\
\ \"acc_norm_stderr\": 0.03223060120708135,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7712685242373997,\n\
\ \"mc2_stderr\": 0.013836184817525006\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520766,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7155945030870344,\n\
\ \"acc_stderr\": 0.004502088287470138,\n \"acc_norm\": 0.8884684325831508,\n\
\ \"acc_norm_stderr\": 0.0031414591751392704\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.036117805602848975,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.036117805602848975\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933715,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5079365079365079,\n \"acc_stderr\": 0.025748065871673297,\n \"\
acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.025748065871673297\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608302,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608302\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168585,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807896,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807896\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768424,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768424\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941635,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n\
\ \"acc_stderr\": 0.01671489037999606,\n \"acc_norm\": 0.4849162011173184,\n\
\ \"acc_norm_stderr\": 0.01671489037999606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5280312907431551,\n\
\ \"acc_stderr\": 0.012750151802922447,\n \"acc_norm\": 0.5280312907431551,\n\
\ \"acc_norm_stderr\": 0.012750151802922447\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6928104575163399,\n \"acc_stderr\": 0.018663359671463663,\n \
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.018663359671463663\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.026537045312145277,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.026537045312145277\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7712685242373997,\n\
\ \"mc2_stderr\": 0.013836184817525006\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828077\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6391205458680819,\n \
\ \"acc_stderr\": 0.013228626753925152\n }\n}\n```"
repo_url: https://huggingface.co/vicgalle/ConfigurableBeagle-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|arc:challenge|25_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|gsm8k|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hellaswag|10_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T20-07-53.790814.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T20-07-53.790814.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- '**/details_harness|winogrande|5_2024-02-17T20-07-53.790814.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T20-07-53.790814.parquet'
- config_name: results
data_files:
- split: 2024_02_17T20_07_53.790814
path:
- results_2024-02-17T20-07-53.790814.parquet
- split: latest
path:
- results_2024-02-17T20-07-53.790814.parquet
---
# Dataset Card for Evaluation run of vicgalle/ConfigurableBeagle-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/ConfigurableBeagle-11B](https://huggingface.co/vicgalle/ConfigurableBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
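Since each run's split is named after its timestamp (with the `-` and `:` of an ISO 8601 timestamp replaced by `_`), the split name can be parsed back into a `datetime`. A minimal sketch, assuming the `YYYY_MM_DDTHH_MM_SS.ffffff` naming shown in the configs above:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2024_02_17T20_07_53.790814" use "_" where an
    # ISO 8601 timestamp would use "-" (date part) and ":" (time part).
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(parse_split_timestamp("2024_02_17T20_07_53.790814"))
# → 2024-02-17 20:07:53.790814
```

This can be used, for example, to sort the available splits chronologically when a repository accumulates multiple runs.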
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__ConfigurableBeagle-11B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-17T20:07:53.790814](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__ConfigurableBeagle-11B/blob/main/results_2024-02-17T20-07-53.790814.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its timestamped split and in the "latest" split of the corresponding configuration):
```json
{
"all": {
"acc": 0.6708789898574004,
"acc_stderr": 0.03158856216976448,
"acc_norm": 0.6718653307631027,
"acc_norm_stderr": 0.03223060120708135,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7712685242373997,
"mc2_stderr": 0.013836184817525006
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520766,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7155945030870344,
"acc_stderr": 0.004502088287470138,
"acc_norm": 0.8884684325831508,
"acc_norm_stderr": 0.0031414591751392704
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.036117805602848975,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.036117805602848975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933715,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.025748065871673297,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.025748065871673297
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608302,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168585,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807896,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807896
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768424,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768424
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941635,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.01671489037999606,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.01671489037999606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5280312907431551,
"acc_stderr": 0.012750151802922447,
"acc_norm": 0.5280312907431551,
"acc_norm_stderr": 0.012750151802922447
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.018663359671463663,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.018663359671463663
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.026537045312145277,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.026537045312145277
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7712685242373997,
"mc2_stderr": 0.013836184817525006
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828077
},
"harness|gsm8k|5": {
"acc": 0.6391205458680819,
"acc_stderr": 0.013228626753925152
}
}
```
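Since the block above is plain JSON, per-category aggregates can be recomputed locally; a minimal sketch (values copied from three of the `hendrycksTest` entries above, selection is illustrative only):

```python
# A small slice of the results block above, keyed by harness task name.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
}

# Average the accuracy over every MMLU ("hendrycksTest") subtask present.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))  # → 0.5701
```

The same scan over all 57 `hendrycksTest` entries reproduces the MMLU-style average the leaderboard reports.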
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/maxwell_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maxwell/マクスウェル/麦斯威尔/맥스웰 (Nikke: Goddess of Victory)
This is the dataset of maxwell/マクスウェル/麦斯威尔/맥스웰 (Nikke: Goddess of Victory), containing 59 images and their tags.
The core tags of this character are `breasts, blonde_hair, blue_eyes, bangs, large_breasts, sidelocks, short_hair, hair_between_eyes, hat, bandaid_on_face, visor_cap`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 59 | 80.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maxwell_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 59 | 43.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maxwell_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 141 | 94.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maxwell_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 59 | 69.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maxwell_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 141 | 136.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maxwell_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maxwell_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 44 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, cleavage, navel, bare_shoulders, blush, collarbone, open_mouth, bandaid, black_choker, headset, piercing, simple_background, stomach, armband, black_gloves, earrings, upper_body, white_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | cleavage | navel | bare_shoulders | blush | collarbone | open_mouth | bandaid | black_choker | headset | piercing | simple_background | stomach | armband | black_gloves | earrings | upper_body | white_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-----------|:--------|:-----------------|:--------|:-------------|:-------------|:----------|:---------------|:----------|:-----------|:--------------------|:----------|:----------|:---------------|:-----------|:-------------|:---------------|
| 0 | 44 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TIGER-Lab/SKGInstruct-skg-only | ---
license: cc-by-nc-2.0
task_categories:
- text-generation
language:
- en
pretty_name: SKGInstruct
size_categories:
- 100K<n<1M
tags:
- code
- SKG
---
# 🏗️ StructLM: Towards Building Generalist Models for Structured Knowledge Grounding
SKGInstruct-skg-only is an instruction tuning dataset constructed from 19 structured knowledge grounding datasets.
Project Page: [https://tiger-ai-lab.github.io/StructLM/](https://tiger-ai-lab.github.io/StructLM/)
Paper: [https://arxiv.org/pdf/2402.16671.pdf](https://arxiv.org/pdf/2402.16671.pdf)
Code: [https://github.com/TIGER-AI-Lab/StructLM](https://github.com/TIGER-AI-Lab/StructLM)
Models:
7B | [StructLM-7B](https://huggingface.co/TIGER-Lab/StructLM-7B)
13B | [StructLM-13B](https://huggingface.co/TIGER-Lab/StructLM-13B)
34B | [StructLM-34B](https://huggingface.co/TIGER-Lab/StructLM-34B)
## **License**
| Dataset Name | License Type |
|--------------|----------------|
| TabMWP | [Attribution-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-sa/4.0/)|
| everything else | [Attribution-NonCommercial-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-nc-sa/4.0/)|
## **Citation**
Please cite our paper if you use our data, model or code. Please also kindly cite the original dataset papers.
```
@misc{zhuang2024structlm,
title={StructLM: Towards Building Generalist Models for Structured Knowledge Grounding},
author={Alex Zhuang and Ge Zhang and Tianyu Zheng and Xinrun Du and Junjie Wang and Weiming Ren and Stephen W. Huang and Jie Fu and Xiang Yue and Wenhu Chen},
year={2024},
eprint={2402.16671},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
macadeliccc/truthy-dpo-v0.1-orca-format | ---
language:
- en
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1344072
num_examples: 1016
download_size: 652993
dataset_size: 1344072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "truthy-dpo-v0.1-orca-format"
Credit to jondurbin for truthy-dpo-v0.1. This is a re-upload of his dataset with different column names to streamline my finetuning process.
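For reference, each row maps directly onto the (prompt, chosen, rejected) triple expected by common DPO trainers; a minimal sketch with an invented row (field names come from the schema above, the helper name and content are made up for illustration):

```python
def to_dpo_pair(row):
    """Assemble the prompt (system message + question) and the preference
    pair from one row of this dataset's column layout."""
    prompt = (row["system"] + "\n\n" if row["system"] else "") + row["question"]
    return {"prompt": prompt, "chosen": row["chosen"], "rejected": row["rejected"]}

# Invented example row for illustration only.
row = {
    "id": "0",
    "source": "truthy_dpo",
    "system": "You are an honest assistant.",
    "question": "Can you see the stars?",
    "chosen": "As an AI, I have no eyes and cannot see anything.",
    "rejected": "Yes, the stars look beautiful tonight.",
}
pair = to_dpo_pair(row)
print(pair["prompt"].startswith("You are an honest assistant."))  # → True
```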
If you use this dataset, please cite jondurbin. |
seanghay/service.gov.kh | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: ministry_id
dtype: int64
- name: body
dtype: string
splits:
- name: train
num_bytes: 17904634
num_examples: 557
download_size: 3686057
dataset_size: 17904634
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cjvt/janes_preklop | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: language
sequence: string
splits:
- name: train
num_bytes: 410822
num_examples: 1104
download_size: 623816
dataset_size: 412672
task_categories:
- token-classification
language:
- sl
tags:
- tweets
- code-mixing
- code-switching
size_categories:
- 1K<n<10K
---
# Dataset Card for Janes-Preklop
### Dataset Summary
Janes-Preklop is a corpus of Slovene tweets that is manually annotated for code-switching: the use of words from two
or more languages within one sentence or utterance.
### Languages
Code-switched Slovenian.
## Dataset Structure
### Data Instances
A sample instance from the dataset - each word is annotated with its language, either `"default"`
(Slovenian/unclassifiable), `en` (English), `de` (German), `hbs` (Serbo-Croatian), `sp` (Spanish),
`la` (Latin), `ar` (Arabic), `fr` (French), `it` (Italian), or `pt` (Portuguese).
```
{
'id': 'tid.397447931558895616',
'words': ['Brad', 'Pitt', 'na', 'Planet', 'TV', '.', 'U', 'are', 'welcome', ';)'],
'language': ['default', 'default', 'default', 'default', 'default', 'default', 'B-en', 'I-en', 'I-en', 'I-en']
}
```
### Data Fields
- `id`: unique identifier of the example;
- `words`: words in the sentence;
- `language`: language of each word.
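Because the `language` field uses BIO-style tags, contiguous code-switched spans can be recovered with a simple scan; a minimal sketch (the helper name is illustrative, and the sample is the instance shown above):

```python
def extract_switched_spans(words, language):
    """Group contiguous non-default words into (language, text) spans
    using the BIO-style tags: B-xx opens a span, I-xx continues it."""
    spans = []
    for word, tag in zip(words, language):
        if tag.startswith("B-"):
            spans.append([tag[2:], [word]])
        elif tag.startswith("I-") and spans:
            spans[-1][1].append(word)
    return [(lang, " ".join(ws)) for lang, ws in spans]

# The sample instance from above.
sample = {
    "words": ["Brad", "Pitt", "na", "Planet", "TV", ".", "U", "are", "welcome", ";)"],
    "language": ["default"] * 6 + ["B-en", "I-en", "I-en", "I-en"],
}
print(extract_switched_spans(sample["words"], sample["language"]))
# → [('en', 'U are welcome ;)')]
```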
## Additional Information
### Dataset Curators
Špela Reher, Tomaž Erjavec, Darja Fišer.
### Licensing Information
CC BY-SA 4.0.
### Citation Information
```
@misc{janes_preklop,
title = {Tweet code-switching corpus Janes-Preklop 1.0},
author = {Reher, {\v S}pela and Erjavec, Toma{\v z} and Fi{\v s}er, Darja},
url = {http://hdl.handle.net/11356/1154},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {Creative Commons - Attribution-{ShareAlike} 4.0 International ({CC} {BY}-{SA} 4.0)},
issn = {2820-4042},
year = {2017}
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset. |
schooly/cas-chatgpt-prompts | ---
license: mit
---
|
open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV5-7b | ---
pretty_name: Evaluation run of ChaoticNeutrals/Prima-LelantaclesV5-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/Prima-LelantaclesV5-7b](https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV5-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV5-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T12:18:57.665828](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV5-7b/blob/main/results_2024-02-21T12-18-57.665828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499356405196555,\n\
\ \"acc_stderr\": 0.03220078301623862,\n \"acc_norm\": 0.6501687781412798,\n\
\ \"acc_norm_stderr\": 0.032862169524457766,\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.682552940107254,\n\
\ \"mc2_stderr\": 0.015087198326455812\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.01358257109581529,\n\
\ \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.013307250444941108\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714797849034057,\n\
\ \"acc_stderr\": 0.004505879084606843,\n \"acc_norm\": 0.8787094204341764,\n\
\ \"acc_norm_stderr\": 0.0032579745937899407\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02390115797940253,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02390115797940253\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662253,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.682552940107254,\n\
\ \"mc2_stderr\": 0.015087198326455812\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6482183472327521,\n \
\ \"acc_stderr\": 0.013153446023536039\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV5-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|arc:challenge|25_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|gsm8k|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hellaswag|10_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-18-57.665828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T12-18-57.665828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- '**/details_harness|winogrande|5_2024-02-21T12-18-57.665828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T12-18-57.665828.parquet'
- config_name: results
data_files:
- split: 2024_02_21T12_18_57.665828
path:
- results_2024-02-21T12-18-57.665828.parquet
- split: latest
path:
- results_2024-02-21T12-18-57.665828.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/Prima-LelantaclesV5-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Prima-LelantaclesV5-7b](https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV5-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV5-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-21T12:18:57.665828](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV5-7b/blob/main/results_2024-02-21T12-18-57.665828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6499356405196555,
"acc_stderr": 0.03220078301623862,
"acc_norm": 0.6501687781412798,
"acc_norm_stderr": 0.032862169524457766,
"mc1": 0.5165238678090576,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.682552940107254,
"mc2_stderr": 0.015087198326455812
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.01358257109581529,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.013307250444941108
},
"harness|hellaswag|10": {
"acc": 0.714797849034057,
"acc_stderr": 0.004505879084606843,
"acc_norm": 0.8787094204341764,
"acc_norm_stderr": 0.0032579745937899407
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02390115797940253,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02390115797940253
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662253,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5165238678090576,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.682552940107254,
"mc2_stderr": 0.015087198326455812
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.6482183472327521,
"acc_stderr": 0.013153446023536039
}
}
```
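The averaged metrics in the `all` block can be reproduced from the per-task entries. A minimal sketch, using a trimmed copy of the dictionary above (load the full JSON file from this repository for the real numbers):

```python
# Trimmed copy of the results dictionary shown above; in practice, load the
# full results_*.json file from this repository instead.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|arc:challenge|25": {"acc": 0.6843003412969283},
}

# Mean accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```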
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Kelvin878/magnetic | ---
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 51974634.336
num_examples: 1344
download_size: 51417795
dataset_size: 51974634.336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GroNLP/ik-nlp-22_transqe | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
- machine-generated
language:
- en
- nl
license:
- apache-2.0
multilinguality:
- translation
size_categories:
- unknown
source_datasets:
- extended|esnli
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: iknlp22-transqe
tags:
- quality-estimation
---
# Dataset Card for IK-NLP-22 Project 3: Translation Quality-driven Data Selection for Natural Language Inference
## Table of Contents
- [Dataset Card for IK-NLP-22 Project 3: Translation Quality-driven Data Selection for Natural Language Inference](#dataset-card-for-ik-nlp-22-project-3-translation-quality-driven-data-selection-for-natural-language-inference)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits](#data-splits)
- [Data Example](#data-example)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Source:** [Github](https://github.com/OanaMariaCamburu/e-SNLI)
- **Point of Contact:** [Gabriele Sarti](mailto:ik-nlp-course@rug.nl)
### Dataset Summary
This dataset contains the full [e-SNLI](https://huggingface.co/datasets/esnli) dataset, automatically translated to Dutch using the [Helsinki-NLP/opus-mt-en-nl](https://huggingface.co/Helsinki-NLP/opus-mt-en-nl) neural machine translation model. The translation of each field has been annotated with two quality estimation scores using the referenceless version of the [COMET](https://github.com/Unbabel/COMET/) metric by Unbabel.
The intended usage of this corpus is restricted to the scope of final project for the 2022 edition of the Natural Language Processing course at the Information Science Master's Degree (IK) at the University of Groningen, taught by [Arianna Bisazza](https://research.rug.nl/en/persons/arianna-bisazza) and [Gabriele Sarti](https://research.rug.nl/en/persons/gabriele-sarti), with the assistance of [Anjali Nair](https://nl.linkedin.com/in/anjalinair012).
*The e-SNLI corpus was made freely available by the authors on Github. The present dataset was created for educational purposes, and is based on the original e-SNLI dataset by Camburu et al. All rights to the present contents are attributed to the original authors.*
### Languages
The language data of this corpus is in English (BCP-47 `en`) and Dutch (BCP-47 `nl`).
## Dataset Structure
### Data Instances
The dataset contains a single configuration by default, named `plain_text`, with the three original splits `train`, `validation` and `test`. Every split contains the following fields:
| **Field** | **Description** |
|------------|-----------------------------|
|`premise_en`| The original English premise.|
|`premise_nl`| The premise automatically translated to Dutch.|
|`hypothesis_en`| The original English hypothesis.|
|`hypothesis_nl`| The hypothesis automatically translated to Dutch.|
|`label`| The label of the data instance (0 for entailment, 1 for neutral, 2 for contradiction).|
|`explanation_1_en`| The first explanation for the assigned label in English.|
|`explanation_1_nl`| The first explanation automatically translated to Dutch.|
|`explanation_2_en`| The second explanation for the assigned label in English.|
|`explanation_2_nl`| The second explanation automatically translated to Dutch.|
|`explanation_3_en`| The third explanation for the assigned label in English.|
|`explanation_3_nl`| The third explanation automatically translated to Dutch.|
|`da_premise`| The quality estimation produced by the `wmt20-comet-qe-da` model for the premise translation.|
|`da_hypothesis`| The quality estimation produced by the `wmt20-comet-qe-da` model for the hypothesis translation.|
|`da_explanation_1`| The quality estimation produced by the `wmt20-comet-qe-da` model for the first explanation translation.|
|`da_explanation_2`| The quality estimation produced by the `wmt20-comet-qe-da` model for the second explanation translation.|
|`da_explanation_3`| The quality estimation produced by the `wmt20-comet-qe-da` model for the third explanation translation.|
|`mqm_premise`| The quality estimation produced by the `wmt21-comet-qe-mqm` model for the premise translation.|
|`mqm_hypothesis`| The quality estimation produced by the `wmt21-comet-qe-mqm` model for the hypothesis translation.|
|`mqm_explanation_1`| The quality estimation produced by the `wmt21-comet-qe-mqm` model for the first explanation translation.|
|`mqm_explanation_2`| The quality estimation produced by the `wmt21-comet-qe-mqm` model for the second explanation translation.|
|`mqm_explanation_3`| The quality estimation produced by the `wmt21-comet-qe-mqm` model for the third explanation translation.|
Explanations 2 and 3 and their related quality estimation scores are only present in the `validation` and `test` splits.
### Data Splits
| config| train | validation | test |
|------------:|---------|------------|------|
|`plain_text` | 549,367 | 9,842 | 9,824 |
For your analyses, use the amount of data that is the most reasonable for your computational setup. The more, the better.
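One natural use of the QE columns is threshold-based data selection: keep only examples whose translations clear a quality cut-off. A minimal offline sketch (toy rows stand in for real examples, and the 0.5 threshold is an arbitrary choice for illustration; note that the scores are stored as strings):

```python
# Toy stand-ins for dataset rows; in practice, load the real split with
# datasets.load_dataset("GroNLP/ik-nlp-22_transqe", split="train").
rows = [
    {"da_premise": "0.6099", "da_hypothesis": "0.8504", "label": 1},
    {"da_premise": "0.1021", "da_hypothesis": "0.9130", "label": 0},
]

THRESHOLD = 0.5  # arbitrary cut-off, for illustration only

# QE scores are stored as strings, so cast to float before comparing.
selected = [
    r for r in rows
    if float(r["da_premise"]) > THRESHOLD and float(r["da_hypothesis"]) > THRESHOLD
]
print(len(selected))  # only the first toy row clears both thresholds
```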
### Data Example
The following is an example of entry 2000 taken from the `test` split:
```json
{
"premise_en": "A young woman wearing a yellow sweater and black pants is ice skating outdoors.",
"premise_nl": "Een jonge vrouw met een gele trui en zwarte broek schaatst buiten.",
"hypothesis_en": "a woman is practicing for the olympics",
"hypothesis_nl": "een vrouw oefent voor de Olympische Spelen",
"label": 1,
"explanation_1_en": "You can not infer it's for the Olympics.",
"explanation_1_nl": "Het is niet voor de Olympische Spelen.",
"explanation_2_en": "Just because a girl is skating outdoors does not mean she is practicing for the Olympics.",
"explanation_2_nl": "Alleen omdat een meisje buiten schaatst betekent niet dat ze oefent voor de Olympische Spelen.",
"explanation_3_en": "Ice skating doesn't imply practicing for the olympics.",
"explanation_3_nl": "Schaatsen betekent niet oefenen voor de Olympische Spelen.",
"da_premise": "0.6099",
"mqm_premise": "0.1298",
"da_hypothesis": "0.8504",
"mqm_hypothesis": "0.1521",
"da_explanation_1": "0.0001",
"mqm_explanation_1": "0.1237",
"da_explanation_2": "0.4017",
"mqm_explanation_2": "0.1467",
"da_explanation_3": "0.6069",
"mqm_explanation_3": "0.1389"
}
```
### Dataset Creation
The dataset was created through the following steps:
- Translating every field of the original e-SNLI corpus to Dutch using the [Helsinki-NLP/opus-mt-en-nl](https://huggingface.co/Helsinki-NLP/opus-mt-en-nl) neural machine translation model.
- Annotating the quality estimation of the translations with two referenceless versions of the [COMET](https://github.com/Unbabel/COMET/) metric by Unbabel.
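The flow of those two steps can be sketched as below; the `translate` stub stands in for the Helsinki-NLP/opus-mt-en-nl pipeline so the sketch runs without downloading the model, and the COMET scoring step is only indicated, not implemented:

```python
from typing import List

def translate(texts: List[str]) -> List[str]:
    """Stub for the MT step; the real pipeline wraps
    transformers.pipeline("translation", model="Helsinki-NLP/opus-mt-en-nl")."""
    return ["<nl> " + t for t in texts]  # placeholder Dutch output

fields = ["premise", "hypothesis", "explanation_1"]
example = {
    "premise": "A man is playing a guitar.",
    "hypothesis": "A person plays music.",
    "explanation_1": "Playing a guitar is playing music.",
}

# Step 1: translate every English field to Dutch.
dutch = dict(zip(fields, translate([example[f] for f in fields])))
row = {**{f + "_en": example[f] for f in fields},
       **{f + "_nl": dutch[f] for f in fields}}

# Step 2 (not implemented here): score each (source, translation) pair with
# the referenceless COMET models to obtain the da_* and mqm_* columns.
print(sorted(row))
```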
## Additional Information
### Dataset Curators
For problems on this 🤗 Datasets version, please contact us at [ik-nlp-course@rug.nl](mailto:ik-nlp-course@rug.nl).
### Licensing Information
The dataset is licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0.html).
### Citation Information
Please cite the authors if you use these corpora in your work:
```bibtex
@incollection{NIPS2018_8163,
title = {e-SNLI: Natural Language Inference with Natural Language Explanations},
author = {Camburu, Oana-Maria and Rockt\"{a}schel, Tim and Lukasiewicz, Thomas and Blunsom, Phil},
booktitle = {Advances in Neural Information Processing Systems 31},
editor = {S. Bengio and H. Wallach and H. Larochelle and K. Grauman and N. Cesa-Bianchi and R. Garnett},
pages = {9539--9549},
year = {2018},
publisher = {Curran Associates, Inc.},
url = {http://papers.nips.cc/paper/8163-e-snli-natural-language-inference-with-natural-language-explanations.pdf}
}
``` |
income/robust04-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# Robust04: 20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the above BEIR benchmark dataset.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in Robust04 in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
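The generation recipe samples multiple queries per passage. A minimal sketch of the overall shape, with a stub in place of the T5 model so the grouping logic runs offline (the real script decodes k sampled outputs per passage from BeIR/query-gen-msmarco-t5-base-v1 via `model.generate(..., do_sample=True, num_return_sequences=k)`; the document id below is illustrative):

```python
def sample_queries(passage: str, k: int = 20) -> list:
    """Stub for docT5query sampling; the real generator decodes k sampled
    outputs from BeIR/query-gen-msmarco-t5-base-v1 for each passage."""
    return [f"query {i} about: {passage[:24]}" for i in range(k)]

# Toy corpus keyed by Robust04-style document ids (the id is illustrative).
corpus = {"FBIS3-10082": "Some Robust04 passage text about a news event ..."}

generated = {doc_id: sample_queries(text) for doc_id, text in corpus.items()}
print(len(generated["FBIS3-10082"]))  # 20 generated queries per passage
```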
Below is the old dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The benchmark supports a leaderboard that evaluates retrieval models on their zero-shot retrieval effectiveness across the datasets above, primarily reported as nDCG@10.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
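The three-file layout above can be sketched end to end in plain Python. The round trip below uses made-up file contents and hand-rolled parsing purely for illustration; in practice the `beir` toolkit's data loader reads these files for you:

```python
import csv
import json
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

# corpus.jsonl: one JSON object per line with _id, title and text
with open(root / "corpus.jsonl", "w") as f:
    f.write(json.dumps({"_id": "doc1", "title": "Albert Einstein",
                        "text": "Albert Einstein was a German-born...."}) + "\n")

# queries.jsonl: one JSON object per line with _id and text
with open(root / "queries.jsonl", "w") as f:
    f.write(json.dumps({"_id": "q1",
                        "text": "Who developed the mass-energy equivalence formula?"}) + "\n")

# qrels file: tab-separated with a header row
with open(root / "qrels.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["query-id", "corpus-id", "score"])
    writer.writerow(["q1", "doc1", 1])

# Read everything back into in-memory dicts
with open(root / "corpus.jsonl") as f:
    corpus = {d["_id"]: {"title": d["title"], "text": d["text"]}
              for d in map(json.loads, f)}
with open(root / "queries.jsonl") as f:
    queries = {q["_id"]: q["text"] for q in map(json.loads, f)}
qrels = {}
with open(root / "qrels.tsv", newline="") as f:
    reader = csv.reader(f, delimiter="\t")
    next(reader)  # skip the header row
    for qid, did, score in reader:
        qrels.setdefault(qid, {})[did] = int(score)

print(qrels)  # {'q1': {'doc1': 1}}
```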
### Data Instances
A high-level example of a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
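To make the qrels layout concrete, the sketch below evaluates a toy ranked run against the qrels above with a hand-rolled nDCG@k, the kind of metric reported on BEIR. This simplified version is only illustrative; real evaluations use `pytrec_eval`:

```python
import math

def ndcg_at_k(qrels, results, k=10):
    """qrels: {qid: {docid: relevance}}; results: {qid: ranked list of docids}."""
    scores = []
    for qid, ranking in results.items():
        rels = qrels.get(qid, {})
        # discounted cumulative gain of the ranking, cut off at k
        dcg = sum(rels.get(doc, 0) / math.log2(rank + 2)
                  for rank, doc in enumerate(ranking[:k]))
        # ideal DCG: relevant documents sorted by relevance
        ideal = sorted(rels.values(), reverse=True)[:k]
        idcg = sum(rel / math.log2(rank + 2) for rank, rel in enumerate(ideal))
        scores.append(dcg / idcg if idcg else 0.0)
    return sum(scores) / len(scores)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
results = {"q1": ["doc1", "doc2"],  # relevant document ranked first
           "q2": ["doc1", "doc2"]}  # relevant document ranked second
print(round(ndcg_at_k(qrels, results), 3))  # 0.815
```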
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `query-id`: a `string` feature representing the unique query id.
  - `corpus-id`: a `string` feature, denoting the document id.
  - `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.

Top-20 generated queries for every passage in NFCorpus
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
autoevaluate/autoeval-eval-emotion-default-fe1aa0-1485654301 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: armandnlp/distilbert-base-uncased-finetuned-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: armandnlp/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
FidelOdok/SOFA_DOA_10_deg | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '101'
'2': '106'
'3': '112'
'4': '117'
'5': '122'
'6': '129'
'7': '134'
'8': '137'
'9': '139'
'10': '151'
'11': '156'
'12': '166'
'13': '169'
'14': '171'
'15': '172'
'16': '18'
'17': '182'
'18': '187'
'19': '189'
'20': '190'
'21': '192'
'22': '200'
'23': '205'
'24': '207'
'25': '209'
'26': '211'
'27': '218'
'28': '219'
'29': '221'
'30': '224'
'31': '226'
'32': '227'
'33': '229'
'34': '237'
'35': '239'
'36': '242'
'37': '244'
'38': '257'
'39': '26'
'40': '260'
'41': '262'
'42': '265'
'43': '278'
'44': '281'
'45': '3'
'46': '312'
'47': '317'
'48': '328'
'49': '343'
'50': '351'
'51': '354'
'52': '356'
'53': '358'
'54': '359'
'55': '368'
'56': '369'
'57': '371'
'58': '372'
'59': '373'
'60': '378'
'61': '380'
'62': '383'
'63': '385'
'64': '386'
'65': '391'
'66': '394'
'67': '397'
'68': '4'
'69': '422'
'70': '423'
'71': '424'
'72': '426'
'73': '427'
'74': '428'
'75': '46'
'76': '49'
'77': '5'
'78': '50'
'79': '58'
'80': '6'
'81': '66'
'82': '67'
'83': '69'
'84': '7'
'85': '71'
'86': '73'
'87': '82'
'88': '84'
'89': '86'
'90': '87'
'91': '89'
'92': '96'
splits:
- name: train
num_bytes: 21491848138.0
num_examples: 22500
download_size: 999178438
dataset_size: 21491848138.0
---
# Dataset Card for "SOFA_DOA_10_deg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SillyL12324/girls | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 26971470920.248
num_examples: 343222
download_size: 10458353483
dataset_size: 26971470920.248
---
# Dataset Card for "girls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eurlex | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-label-classification
paperswithcode_id: eurlex57k
pretty_name: the EUR-Lex dataset
tags:
- legal-topic-classification
dataset_info:
features:
- name: celex_id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: eurovoc_concepts
sequence: string
config_name: eurlex57k
splits:
- name: train
num_bytes: 167603718
num_examples: 45000
- name: test
num_bytes: 22046706
num_examples: 6000
- name: validation
num_bytes: 21942574
num_examples: 6000
download_size: 50289403
dataset_size: 211592998
---
# Dataset Card for the EUR-Lex dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://nlp.cs.aueb.gr/software_and_datasets/EURLEX57K/
- **Repository:** http://nlp.cs.aueb.gr/software_and_datasets/EURLEX57K/
- **Paper:** https://www.aclweb.org/anthology/P19-1636/
- **Leaderboard:** N/A
- **Point of Contact:** [Ilias Chalkidis](mailto:ihalk@aueb.gr)
### Dataset Summary
EURLEX57K can be viewed as an improved version of the dataset released by Mencía and Fürnkranz (2007), which has been widely used in Large-scale Multi-label Text Classification (LMTC) research, but is less than half the size of EURLEX57K (19.6k documents, 4k EUROVOC labels) and more than ten years old.
EURLEX57K contains 57k legislative documents in English from EUR-Lex (https://eur-lex.europa.eu) with an average length of 727 words. Each document contains four major zones:
- the header, which includes the title and name of the legal body enforcing the legal act;
- the recitals, which are legal background references;
- the main body, usually organized in articles; and
- the attachments, such as appendices and annexes.
**Labeling / Annotation**
All the documents of the dataset have been annotated by the Publications Office of the EU (https://publications.europa.eu/en) with multiple concepts from EUROVOC (http://eurovoc.europa.eu/).
While EUROVOC includes approx. 7k concepts (labels), only 4,271 (59.31%) are present in EURLEX57K, of which only 2,049 (47.97%) have been assigned to more than 10 documents. The 4,271 labels are also divided into frequent (746 labels), few-shot (3,362), and zero-shot (163), depending on whether they were assigned to more than 50, fewer than 50 but at least one, or no training documents, respectively.
### Supported Tasks and Leaderboards
The dataset supports:
**Multi-label Text Classification:** Given the text of a document, a model predicts the relevant EUROVOC concepts.
**Few-shot and Zero-shot learning:** As already noted, the labels can be divided into three groups: frequent (746 labels), few-shot (3,362), and zero-shot (163), depending on whether they were assigned to more than 50, fewer than 50 but at least one, or no training documents, respectively.
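The frequent / few-shot / zero-shot split above can be derived from training-set label counts alone. The toy documents and label set below are invented for illustration; on the real dataset the same thresholds yield 746, 3,362 and 163 labels respectively:

```python
from collections import Counter

train_docs = [
    {"eurovoc_concepts": ["192", "2356"]},
    {"eurovoc_concepts": ["192"]},
]
all_labels = {"192", "2356", "863"}  # full label set, incl. test-only labels

# count how many training documents each label is assigned to
counts = Counter(label for doc in train_docs for label in doc["eurovoc_concepts"])

frequent = {l for l in all_labels if counts[l] > 50}        # > 50 train docs
few_shot = {l for l in all_labels if 1 <= counts[l] <= 50}  # 1-50 train docs
zero_shot = {l for l in all_labels if counts[l] == 0}       # unseen in training

print(sorted(few_shot), sorted(zero_shot))  # ['192', '2356'] ['863']
```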
### Languages
All documents are written in English.
## Dataset Structure
### Data Instances
```json
{
"celex_id": "31979D0509",
"title": "79/509/EEC: Council Decision of 24 May 1979 on financial aid from the Community for the eradication of African swine fever in Spain",
"text": "COUNCIL DECISION of 24 May 1979 on financial aid from the Community for the eradication of African swine fever in Spain (79/509/EEC)\nTHE COUNCIL OF THE EUROPEAN COMMUNITIES\nHaving regard to the Treaty establishing the European Economic Community, and in particular Article 43 thereof,\nHaving regard to the proposal from the Commission (1),\nHaving regard to the opinion of the European Parliament (2),\nWhereas the Community should take all appropriate measures to protect itself against the appearance of African swine fever on its territory;\nWhereas to this end the Community has undertaken, and continues to undertake, action designed to contain outbreaks of this type of disease far from its frontiers by helping countries affected to reinforce their preventive measures ; whereas for this purpose Community subsidies have already been granted to Spain;\nWhereas these measures have unquestionably made an effective contribution to the protection of Community livestock, especially through the creation and maintenance of a buffer zone north of the river Ebro;\nWhereas, however, in the opinion of the Spanish authorities themselves, the measures so far implemented must be reinforced if the fundamental objective of eradicating the disease from the entire country is to be achieved;\nWhereas the Spanish authorities have asked the Community to contribute to the expenses necessary for the efficient implementation of a total eradication programme;\nWhereas a favourable response should be given to this request by granting aid to Spain, having regard to the undertaking given by that country to protect the Community against African swine fever and to eliminate completely this disease by the end of a five-year eradication plan;\nWhereas this eradication plan must include certain measures which guarantee the effectiveness of the action taken, and it must be possible to adapt these measures to developments in the situation by means of a procedure establishing close cooperation 
between the Member States and the Commission;\nWhereas it is necessary to keep the Member States regularly informed as to the progress of the action undertaken,",
"eurovoc_concepts": ["192", "2356", "2560", "862", "863"]
}
```
### Data Fields
The following data fields are provided for documents (`train`, `dev`, `test`):
`celex_id`: (**str**) The official ID of the document. The CELEX number is the unique identifier for all publications in both Eur-Lex and CELLAR.\
`title`: (**str**) The title of the document.\
`text`: (**str**) The full content of each document, which is represented by its `header`, `recitals` and `main_body`.\
`eurovoc_concepts`: (**List[str]**) The relevant EUROVOC concepts (labels).
If you want to use the descriptors of EUROVOC concepts, similar to Chalkidis et al. (2020), please load: https://archive.org/download/EURLEX57K/eurovoc_concepts.jsonl
```python
import json

with open('./eurovoc_concepts.jsonl') as jsonl_file:
    # One JSON object per line; a set comprehension over dicts would raise
    # TypeError (dicts are unhashable), so collect them in a list instead.
    eurovoc_concepts = [json.loads(line) for line in jsonl_file]
```
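Since each document is a plain JSON object with the fields above, downstream preprocessing reduces to ordinary dictionary handling. As a small self-contained sketch (using a synthetic record that mirrors the schema; only the `eurovoc_concepts` values are copied from the example document above):

```python
# A synthetic record following the card's schema (real values come from
# the train/dev/test splits):
record = {
    "celex_id": "31979D0509",
    "title": "COUNCIL DECISION of 24 May 1979 ...",
    "text": "THE COUNCIL OF THE EUROPEAN COMMUNITIES ...",
    "eurovoc_concepts": ["192", "2356", "2560", "862", "863"],
}

def label_vocabulary(records):
    """Collect the sorted set of EUROVOC concept ids (the multi-label
    targets) appearing in a list of records."""
    labels = set()
    for r in records:
        labels.update(r["eurovoc_concepts"])
    return sorted(labels)

print(label_vocabulary([record]))  # → ['192', '2356', '2560', '862', '863']
```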
### Data Splits
| Split | No of Documents | Avg. words | Avg. labels |
| ----- | --------------- | ---------- | ----------- |
| Train | 45,000 | 729 | 5 |
| Development | 6,000 | 714 | 5 |
| Test | 6,000 | 725 | 5 |
## Dataset Creation
### Curation Rationale
The dataset was curated by Chalkidis et al. (2019).\
The documents have been annotated by the Publications Office of EU (https://publications.europa.eu/en).
### Source Data
#### Initial Data Collection and Normalization
The original data are available at EUR-Lex portal (https://eur-lex.europa.eu) in an unprocessed format.
The documents were downloaded from EUR-Lex portal in HTML format.
The relevant metadata and EUROVOC concepts were downloaded from the SPARQL endpoint of the Publications Office of EU (http://publications.europa.eu/webapi/rdf/sparql).
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
* The original documents are available at EUR-Lex portal (https://eur-lex.europa.eu) in an unprocessed HTML format. The HTML code was stripped and the documents were split into sections.
* The documents have been annotated by the Publications Office of EU (https://publications.europa.eu/en).
#### Who are the annotators?
Publications Office of EU (https://publications.europa.eu/en)
### Personal and Sensitive Information
The dataset does not include personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Chalkidis et al. (2019)
### Licensing Information
© European Union, 1998-2021
The Commission’s document reuse policy is based on Decision 2011/833/EU. Unless otherwise specified, you can re-use the legal documents published in EUR-Lex for commercial or non-commercial purposes.
The copyright for the editorial content of this website, the summaries of EU legislation and the consolidated texts, which is owned by the EU, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://eur-lex.europa.eu/content/legal-notice/legal-notice.html \
Read more: https://eur-lex.europa.eu/content/help/faq/reuse-contents-eurlex.html
### Citation Information
*Ilias Chalkidis, Manos Fergadiotis, Prodromos Malakasiotis and Ion Androutsopoulos.*
*Large-Scale Multi-Label Text Classification on EU Legislation.*
*Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019). Florence, Italy. 2019*
```
@inproceedings{chalkidis-etal-2019-large,
title = "Large-Scale Multi-Label Text Classification on {EU} Legislation",
author = "Chalkidis, Ilias and Fergadiotis, Manos and Malakasiotis, Prodromos and Androutsopoulos, Ion",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1636",
doi = "10.18653/v1/P19-1636",
pages = "6314--6322"
}
```
### Contributions
Thanks to [@iliaschalkidis](https://github.com/iliaschalkidis) for adding this dataset. |
Soma8622/upload_test | ---
license: mit
---
# Overview
- This dataset was created using the [国会会議録検索システム 検索用API](https://kokkai.ndl.go.jp/api.html) (the National Diet Library's Diet Record Search API).
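As a minimal sketch of how such a dataset can be collected, a query URL for the API's speech-search endpoint can be built as below. The endpoint path and parameter names (`any`, `maximumRecords`, `recordPacking`) are taken from the linked API documentation and should be verified against the current spec before use:

```python
from urllib.parse import urlencode

def build_speech_query(keyword: str, max_records: int = 10) -> str:
    """Build a request URL for the speech-search endpoint of the Diet
    Record Search API (see https://kokkai.ndl.go.jp/api.html)."""
    base = "https://kokkai.ndl.go.jp/api/speech"
    params = {
        "any": keyword,           # full-text keyword over the minutes
        "maximumRecords": max_records,
        "recordPacking": "json",  # request JSON instead of the default XML
    }
    return base + "?" + urlencode(params)

print(build_speech_query("予算", max_records=5))
```

Fetching and parsing the response (e.g. with `urllib.request.urlopen` and `json.load`) is omitted to keep the sketch offline.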
|
asahikuroki222/bonito_privacy_qa_sft_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2093268
num_examples: 7830
- name: test
num_bytes: 530688
num_examples: 1958
download_size: 1061562
dataset_size: 2623956
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Mihaiii__dolphin-2.6-mistral-7b-dpo-5.93B | ---
pretty_name: Evaluation run of Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B](https://huggingface.co/Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__dolphin-2.6-mistral-7b-dpo-5.93B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T23:39:50.457825](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__dolphin-2.6-mistral-7b-dpo-5.93B/blob/main/results_2024-02-29T23-39-50.457825.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2787882344251881,\n\
\ \"acc_stderr\": 0.03159583298843307,\n \"acc_norm\": 0.2808463538110389,\n\
\ \"acc_norm_stderr\": 0.032417699387269676,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5350613699912699,\n\
\ \"mc2_stderr\": 0.015713472335590086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.013944635930726087,\n\
\ \"acc_norm\": 0.38993174061433444,\n \"acc_norm_stderr\": 0.014252959848892887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4565823541127266,\n\
\ \"acc_stderr\": 0.0049709334202319285,\n \"acc_norm\": 0.6101374228241386,\n\
\ \"acc_norm_stderr\": 0.004867221634461264\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952674,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952674\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051982,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051982\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686935,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212387,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212387\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.038061426873099935,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.038061426873099935\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.032684540130117436,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.032684540130117436\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2709677419354839,\n \"acc_stderr\": 0.025284416114900156,\n \"\
acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.025284416114900156\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233483,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233483\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25871559633027524,\n \"acc_stderr\": 0.018776052319619624,\n \"\
acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.018776052319619624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.0302252261600124,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.0302252261600124\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693247,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693247\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3080168776371308,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n\
\ \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.22869955156950672,\n\
\ \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.3086816720257235,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26988265971316816,\n\
\ \"acc_stderr\": 0.01133738108425041,\n \"acc_norm\": 0.26988265971316816,\n\
\ \"acc_norm_stderr\": 0.01133738108425041\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.29248366013071897,\n \"acc_stderr\": 0.018403415710109783,\n \
\ \"acc_norm\": 0.29248366013071897,\n \"acc_norm_stderr\": 0.018403415710109783\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n\
\ \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.30845771144278605,\n\
\ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5350613699912699,\n\
\ \"mc2_stderr\": 0.015713472335590086\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6266771902131019,\n \"acc_stderr\": 0.013594002763035523\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674068\n }\n}\n```"
repo_url: https://huggingface.co/Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|arc:challenge|25_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|gsm8k|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hellaswag|10_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-39-50.457825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T23-39-50.457825.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- '**/details_harness|winogrande|5_2024-02-29T23-39-50.457825.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T23-39-50.457825.parquet'
- config_name: results
data_files:
- split: 2024_02_29T23_39_50.457825
path:
- results_2024-02-29T23-39-50.457825.parquet
- split: latest
path:
- results_2024-02-29T23-39-50.457825.parquet
---
# Dataset Card for Evaluation run of Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B](https://huggingface.co/Mihaiii/dolphin-2.6-mistral-7b-dpo-5.93B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__dolphin-2.6-mistral-7b-dpo-5.93B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T23:39:50.457825](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__dolphin-2.6-mistral-7b-dpo-5.93B/blob/main/results_2024-02-29T23-39-50.457825.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2787882344251881,
"acc_stderr": 0.03159583298843307,
"acc_norm": 0.2808463538110389,
"acc_norm_stderr": 0.032417699387269676,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5350613699912699,
"mc2_stderr": 0.015713472335590086
},
"harness|arc:challenge|25": {
"acc": 0.3506825938566553,
"acc_stderr": 0.013944635930726087,
"acc_norm": 0.38993174061433444,
"acc_norm_stderr": 0.014252959848892887
},
"harness|hellaswag|10": {
"acc": 0.4565823541127266,
"acc_stderr": 0.0049709334202319285,
"acc_norm": 0.6101374228241386,
"acc_norm_stderr": 0.004867221634461264
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952674,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952674
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051982,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051982
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686935,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212387,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212387
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.038061426873099935,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.038061426873099935
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.032684540130117436,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.032684540130117436
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233483,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233483
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.018776052319619624,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.018776052319619624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.0302252261600124,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.0302252261600124
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693247,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693247
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3086816720257235,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.3086816720257235,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26988265971316816,
"acc_stderr": 0.01133738108425041,
"acc_norm": 0.26988265971316816,
"acc_norm_stderr": 0.01133738108425041
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.29248366013071897,
"acc_stderr": 0.018403415710109783,
"acc_norm": 0.29248366013071897,
"acc_norm_stderr": 0.018403415710109783
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5350613699912699,
"mc2_stderr": 0.015713472335590086
},
"harness|winogrande|5": {
"acc": 0.6266771902131019,
"acc_stderr": 0.013594002763035523
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674068
}
}
```
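As a small sketch of how the aggregated numbers above can be post-processed locally, the snippet below computes a mean score across a few tasks. The `results` dict is a hand-copied excerpt of the JSON shown above, not a download from the Hub, and the preference for `acc_norm` over `acc` is an illustrative choice, not the leaderboard's official aggregation:

```python
# Excerpt of the "latest results" JSON above (values copied from the card).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.38993174061433444},
    "harness|hellaswag|10": {"acc_norm": 0.6101374228241386},
    "harness|winogrande|5": {"acc": 0.6266771902131019},
}

# Prefer normalized accuracy when available, fall back to raw accuracy.
scores = {
    task: metrics.get("acc_norm", metrics.get("acc"))
    for task, metrics in results.items()
}
mean_score = sum(scores.values()) / len(scores)
print(f"mean score over {len(scores)} tasks: {mean_score:.4f}")
```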
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
keremberke/aerial-sheep-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
---
### Roboflow Dataset Page
[https://universe.roboflow.com/riis/aerial-sheep/dataset/1](https://universe.roboflow.com/riis/aerial-sheep/dataset/1?ref=roboflow2huggingface)
### Dataset Labels
```
['sheep']
```
### Citation
```
@misc{ aerial-sheep_dataset,
title = { Aerial Sheep Dataset },
type = { Open Source Dataset },
author = { Riis },
  howpublished = { \url{ https://universe.roboflow.com/riis/aerial-sheep } },
url = { https://universe.roboflow.com/riis/aerial-sheep },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { jun },
note = { visited on 2023-01-02 },
}
```
### License
Public Domain
### Dataset Summary
This dataset was exported via roboflow.com on December 2, 2022 at 4:47 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 4133 images.
Sheep are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 600x600 (Stretch)
The following augmentation was applied to create 3 versions of each source image:
* 50% probability of horizontal flip
* 50% probability of vertical flip
* Randomly crop between 0 and 20 percent of the image
* Random brightness adjustment of between -15 and +15 percent
* Random exposure adjustment of between -10 and +10 percent
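The augmentation recipe above can be sketched as a plain-Python parameter sampler — a minimal illustration of the listed probabilities and ranges, not Roboflow's actual implementation (the function and field names are hypothetical):

```python
import random

def sample_augmentation(rng: random.Random) -> dict:
    """Sample one set of augmentation parameters matching the recipe above."""
    return {
        "horizontal_flip": rng.random() < 0.5,   # 50% probability
        "vertical_flip": rng.random() < 0.5,     # 50% probability
        "crop_percent": rng.uniform(0, 20),      # crop 0-20% of the image
        "brightness_percent": rng.uniform(-15, 15),
        "exposure_percent": rng.uniform(-10, 10),
    }

rng = random.Random(42)
# Three augmented versions are generated per source image
versions = [sample_augmentation(rng) for _ in range(3)]
```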
|
CVasNLPExperiments/VQAv2_sample_validation_benchmarks_partition_3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 58
num_examples: 2
download_size: 1368
dataset_size: 58
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MohitK/indian_food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1622005508.6394334
num_examples: 5328
- name: test
num_bytes: 251982030.3925666
num_examples: 941
download_size: 1599894487
dataset_size: 1873987539.032
---
# Dataset Card for "indian_food_images"
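For reference, the `label` feature's integer ids map to the 20 dish names listed in the YAML above; a minimal sketch of that mapping in plain Python:

```python
# Class-id -> name mapping, copied from the `class_label` names above
LABELS = [
    "burger", "butter_naan", "chai", "chapati", "chole_bhature",
    "dal_makhani", "dhokla", "fried_rice", "idli", "jalebi",
    "kaathi_rolls", "kadai_paneer", "kulfi", "masala_dosa", "momos",
    "paani_puri", "pakode", "pav_bhaji", "pizza", "samosa",
]

def id2label(i: int) -> str:
    """Return the dish name for an integer class id."""
    return LABELS[i]

print(id2label(13))  # → masala_dosa
```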
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sayakpaul/instructpix2pix-demo | ---
dataset_info:
features:
- name: input
dtype: string
- name: edit
dtype: string
- name: output
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 2456199.0
num_examples: 5
download_size: 2460397
dataset_size: 2456199.0
---
# Dataset Card for "instructpix2pix-demo"
Dataset was created using [this notebook](https://colab.research.google.com/gist/sayakpaul/f90aa06f8f89c831f798dd5b3939818b/scratchpad.ipynb).
Paper reference: [InstructPix2Pix: Learning to Follow Image Editing Instructions](https://arxiv.org/abs/2211.09800) |
open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T20:33:12.863346](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-21T20-33-12.863346.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13223573825503357,\n\
\ \"em_stderr\": 0.003469085098310238,\n \"f1\": 0.18757340604026812,\n\
\ \"f1_stderr\": 0.0035161268788136742,\n \"acc\": 0.3944208648655765,\n\
\ \"acc_stderr\": 0.008340726173222485\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13223573825503357,\n \"em_stderr\": 0.003469085098310238,\n\
\ \"f1\": 0.18757340604026812,\n \"f1_stderr\": 0.0035161268788136742\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \
\ \"acc_stderr\": 0.004665893134220793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224176\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T20_33_12.863346
path:
- '**/details_harness|drop|3_2023-10-21T20-33-12.863346.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T20-33-12.863346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T20_33_12.863346
path:
- '**/details_harness|gsm8k|5_2023-10-21T20-33-12.863346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T20-33-12.863346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T20_33_12.863346
path:
- '**/details_harness|winogrande|5_2023-10-21T20-33-12.863346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T20-33-12.863346.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- results_2023-08-29T19:38:59.020077.parquet
- split: 2023_10_21T20_33_12.863346
path:
- results_2023-10-21T20-33-12.863346.parquet
- split: latest
path:
- results_2023-10-21T20-33-12.863346.parquet
---
# Dataset Card for Evaluation run of TFLai/Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T20:33:12.863346](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-21T20-33-12.863346.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13223573825503357,
"em_stderr": 0.003469085098310238,
"f1": 0.18757340604026812,
"f1_stderr": 0.0035161268788136742,
"acc": 0.3944208648655765,
"acc_stderr": 0.008340726173222485
},
"harness|drop|3": {
"em": 0.13223573825503357,
"em_stderr": 0.003469085098310238,
"f1": 0.18757340604026812,
"f1_stderr": 0.0035161268788136742
},
"harness|gsm8k|5": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220793
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224176
}
}
```
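The `"all"` block above aggregates the per-task metrics: `em` and `f1` are carried over from the single task that reports them (`harness|drop|3`), while `acc` appears to be the unweighted mean of the accuracy-based tasks. A minimal sketch (using the figures from the JSON above; the aggregation rule is inferred from these numbers, not from leaderboard source code) to check this:

```python
# Per-task accuracies reported above (the two acc-based tasks in this run).
task_acc = {
    "harness|gsm8k|5": 0.029567854435178165,
    "harness|winogrande|5": 0.7592738752959748,
}

# The "all" accuracy matches the unweighted mean over these tasks.
agg_acc = sum(task_acc.values()) / len(task_acc)
print(agg_acc)  # ≈ 0.3944208648655765, matching "all"["acc"] above
```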
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_kaitchup__TheMayonnaise | ---
pretty_name: Evaluation run of kaitchup/TheMayonnaise
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kaitchup/TheMayonnaise](https://huggingface.co/kaitchup/TheMayonnaise) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kaitchup__TheMayonnaise\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T00:57:32.394411](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__TheMayonnaise/blob/main/results_2024-01-28T00-57-32.394411.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654888254301867,\n\
\ \"acc_stderr\": 0.032005734315972555,\n \"acc_norm\": 0.6542921987893688,\n\
\ \"acc_norm_stderr\": 0.032673839464175965,\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.6919294325525855,\n\
\ \"mc2_stderr\": 0.015143200911624674\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n\
\ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.01290255476231396\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7184823740290779,\n\
\ \"acc_stderr\": 0.00448820175664258,\n \"acc_norm\": 0.8845847440748855,\n\
\ \"acc_norm_stderr\": 0.0031886940284536333\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"\
acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n\
\ \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.6919294325525855,\n\
\ \"mc2_stderr\": 0.015143200911624674\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562912\n }\n}\n```"
repo_url: https://huggingface.co/kaitchup/TheMayonnaise
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|arc:challenge|25_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|gsm8k|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hellaswag|10_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T00-57-32.394411.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T00-57-32.394411.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- '**/details_harness|winogrande|5_2024-01-28T00-57-32.394411.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T00-57-32.394411.parquet'
- config_name: results
data_files:
- split: 2024_01_28T00_57_32.394411
path:
- results_2024-01-28T00-57-32.394411.parquet
- split: latest
path:
- results_2024-01-28T00-57-32.394411.parquet
---
# Dataset Card for Evaluation run of kaitchup/TheMayonnaise
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kaitchup/TheMayonnaise](https://huggingface.co/kaitchup/TheMayonnaise) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kaitchup__TheMayonnaise",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-28T00:57:32.394411](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__TheMayonnaise/blob/main/results_2024-01-28T00-57-32.394411.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.654888254301867,
"acc_stderr": 0.032005734315972555,
"acc_norm": 0.6542921987893688,
"acc_norm_stderr": 0.032673839464175965,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.6919294325525855,
"mc2_stderr": 0.015143200911624674
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.734641638225256,
"acc_norm_stderr": 0.01290255476231396
},
"harness|hellaswag|10": {
"acc": 0.7184823740290779,
"acc_stderr": 0.00448820175664258,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.0031886940284536333
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.6919294325525855,
"mc2_stderr": 0.015143200911624674
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562912
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
paullatham1/tweets-test-balanced | ---
dataset_info:
features:
- name: data
dtype: string
- name: is_sarcastic
dtype: int64
splits:
- name: train
num_bytes: 361009
num_examples: 3718
download_size: 226138
dataset_size: 361009
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-tweet_eval-sentiment-be35d9-30474144941 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: cardiffnlp/twitter-roberta-base-sentiment-latest
metrics: []
dataset_name: tweet_eval
dataset_config: sentiment
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: cardiffnlp/twitter-roberta-base-sentiment-latest
* Dataset: tweet_eval
* Config: sentiment
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ericbugin](https://huggingface.co/ericbugin) for evaluating this model. |
yyc777/shenzhen_door | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': door_close
'1': door_open
splits:
- name: train
num_bytes: 4001809.0
num_examples: 130
- name: validation
num_bytes: 490623.0
num_examples: 16
download_size: 4508432
dataset_size: 4492432.0
---
# Dataset Card for "shenzhen_door"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MartinKu/wikipedia_stage2_coverage_20230331 | ---
dataset_info:
features:
- name: text
dtype: string
- name: S_V_position
sequence: int64
- name: O_C_position
sequence: int64
- name: start_point_list
sequence: int64
splits:
- name: train
num_bytes: 60121841137
num_examples: 1089628
download_size: 18558792841
dataset_size: 60121841137
---
# Dataset Card for "wikipedia_stage2_coverage_20230331"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/summeval | ---
language:
- en
---
# SummEval
The annotations include summaries generated by 16 models from 100 source news articles (1600 examples in total).
Each of the summaries was annotated by 5 independent crowdsourced workers and 3 independent experts (8 annotations in total).
Summaries were evaluated across 4 dimensions: coherence, consistency, fluency, relevance.
Each source news article comes with the original reference from the CNN/DailyMail dataset and 10 additional crowdsourced reference summaries.
For this dataset, we averaged the 3 **expert** annotations to get the human scores.
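As an illustration, averaging the three expert annotations per dimension could look like the following sketch; the dict layout and dimension keys mirror the description above but are an assumption, not the dataset's actual schema:

```python
# Hypothetical sketch: average three expert annotation dicts
# dimension-by-dimension. The record layout is an assumption,
# not the dataset's actual schema.
def average_expert_scores(expert_annotations):
    """Return the mean score per evaluation dimension."""
    dimensions = ("coherence", "consistency", "fluency", "relevance")
    return {
        dim: sum(a[dim] for a in expert_annotations) / len(expert_annotations)
        for dim in dimensions
    }

annotations = [
    {"coherence": 4, "consistency": 5, "fluency": 5, "relevance": 4},
    {"coherence": 3, "consistency": 5, "fluency": 4, "relevance": 4},
    {"coherence": 5, "consistency": 5, "fluency": 5, "relevance": 3},
]
print(average_expert_scores(annotations))
```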
source: https://github.com/Yale-LILY/SummEval |
Intuit-GenSRF/hate-speech18 | ---
dataset_info:
features:
- name: text
dtype: string
- name: user_id
dtype: int64
- name: subforum_id
dtype: int64
- name: num_contexts
dtype: int64
- name: labels
sequence: string
splits:
- name: train
num_bytes: 1343052
num_examples: 10944
download_size: 772056
dataset_size: 1343052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hate_speech18"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bartoszmaj/sentiment_two | ---
license: openrail
dataset_info:
features:
- name: sen
struct:
- name: compound
dtype: float64
- name: neg
dtype: float64
- name: neu
dtype: float64
- name: pos
dtype: float64
splits:
- name: train
num_bytes: 32000000
num_examples: 1000000
download_size: 14801738
dataset_size: 32000000
---
|
CyberHarem/yashiya_yui_rokudounoonnatachi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yashiya Yui (Rokudou no Onna-tachi)
This is the dataset of Yashiya Yui (Rokudou no Onna-tachi), containing 63 images and their tags.
The core tags of this character are `red_hair, long_hair, hair_over_one_eye, hair_ornament, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 63 | 50.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 63 | 37.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 118 | 67.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 63 | 50.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 118 | 87.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yashiya_yui_rokudounoonnatachi',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, smile, blush, brown_eyes, hairclip, open_mouth, parody, asymmetrical_bangs, looking_at_viewer, portrait |
| 1 | 19 |  |  |  |  |  | 1girl, cleavage, tied_shirt, midriff, solo, navel, plaid_skirt, yellow_shirt, looking_at_viewer, red_eyes, red_skirt, smile |
| 2 | 7 |  |  |  |  |  | 1girl, cleavage, hairclip, 1boy, chain-link_fence, formal, jacket, white_shirt, skirt, suit, thighhighs, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | blush | brown_eyes | hairclip | open_mouth | parody | asymmetrical_bangs | looking_at_viewer | portrait | cleavage | tied_shirt | midriff | navel | plaid_skirt | yellow_shirt | red_eyes | red_skirt | 1boy | chain-link_fence | formal | jacket | white_shirt | skirt | suit | thighhighs | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:-------------|:-----------|:-------------|:---------|:---------------------|:--------------------|:-----------|:-----------|:-------------|:----------|:--------|:--------------|:---------------|:-----------|:------------|:-------|:-------------------|:---------|:---------|:--------------|:--------|:-------|:-------------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | | X | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X |
|
ferrorist/20240327_korean_dataset_v03 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 119569266
num_examples: 239809
download_size: 65438128
dataset_size: 119569266
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LiveEvil/lucyrev1 | ---
license: apache-2.0
---
|
pe-nlp/ov-kit-files | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: data
dtype: string
splits:
- name: train
num_bytes: 371874760
num_examples: 13135
download_size: 141403660
dataset_size: 371874760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
urvog/llama2_transcripts_healthcare_callcenter | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2979647
num_examples: 1000
download_size: 0
dataset_size: 2979647
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/livre-procedures-fiscales | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Livre des procédures fiscales
source_datasets:
- original
pretty_name: Livre des procédures fiscales
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Livre des procédures fiscales, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
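For illustration, a record matching the fields above can be checked against the documented schema like this; the sample values are placeholders, not real entries from the dataset:

```python
# Placeholder record mirroring the documented schema; the values are
# illustrative, not taken from the actual dataset.
sample = {
    "instruction": "Quel est le texte intégral de l'article ?",
    "input": "",
    "output": "Texte illustratif de l'article...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "L10",
}

def is_valid_record(record):
    """Check that a record exposes exactly the documented string fields."""
    fields = {"instruction", "input", "output", "start", "expiration", "num"}
    return set(record) == fields and all(
        isinstance(record[field], str) for field in fields
    )

print(is_valid_record(sample))
```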
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
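One plausible way such records could be assembled — shown here as a hedged sketch, not the actual generation script — is to pair each article with an instruction drawn at random from the list above:

```python
# Hedged sketch of record assembly; the `article` dict below is a
# placeholder, not a real entry from the Livre des procédures fiscales.
import random

def build_record(article, instructions, seed=0):
    """Pair one article with a randomly drawn instruction."""
    rng = random.Random(seed)
    return {
        "instruction": rng.choice(instructions),
        "input": "",
        "output": article["text"],
        "start": article["start"],
        "expiration": article["expiration"],
        "num": article["num"],
    }

article = {
    "text": "Texte illustratif de l'article...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "L10",
}
record = build_record(
    article, ["Quel est le texte intégral de l'article ?"], seed=0
)
print(record["instruction"])
```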
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
xuanyanhui/Test | ---
license: mit
---
|
JisuofthePark/UNEEK_ESL | ---
task_categories:
- feature-extraction
language:
- en
--- |
dongyoung4091/hh-generated_flan_t5_large_logax | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
sequence: string
- name: log_probs_google/flan-t5-large
sequence: float64
- name: log_probs_google/flan-t5-xl
sequence: float64
splits:
- name: train
num_bytes: 1816277
num_examples: 100
download_size: 900186
dataset_size: 1816277
---
# Dataset Card for "hh-generated_flan_t5_large_logax"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewdotcom/aus-agronomy | ---
license: mit
language:
- en
tags:
- agriculture
- agronomy
- australia
pretty_name: Australian Agronomy Data
---
# Australian Agronomy Data
This is a collection of various agronomy datasets that I have generated to support my work in the development of RAG pipelines and Fine Tuned Foundation Models for use in Australian Agriculture. |
bigscience-data/roots_es_ted_talks_iwslt | ---
language: es
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_es_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory of Transcribed and Translated Talks (WIT3) is a collection of the original TED talks and their translated versions. The translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license (look here). WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
cp500/CT-samples | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 748569212
num_examples: 582951
download_size: 302697027
dataset_size: 748569212
---
# Dataset Card for "CT-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadav/pixel_glue_rte_high_noise | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: validation
num_bytes: 14157030.0
num_examples: 277
download_size: 14146936
dataset_size: 14157030.0
---
# Dataset Card for "pixel_glue_rte_high_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abasu41/CQLTrainer | ---
license: apache-2.0
dataset_info:
features:
- name: name
dtype: string
- name: oid
dtype: string
splits:
- name: train
num_bytes: 3072
num_examples: 47
download_size: 2786
dataset_size: 3072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_abhishek__autotrain-c71ux-tngfu | ---
pretty_name: Evaluation run of abhishek/autotrain-c71ux-tngfu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishek/autotrain-c71ux-tngfu](https://huggingface.co/abhishek/autotrain-c71ux-tngfu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__autotrain-c71ux-tngfu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:38:22.374324](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-c71ux-tngfu/blob/main/results_2024-03-29T21-38-22.374324.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/abhishek/autotrain-c71ux-tngfu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-38-22.374324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-38-22.374324.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- '**/details_harness|winogrande|5_2024-03-29T21-38-22.374324.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-38-22.374324.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_38_22.374324
path:
- results_2024-03-29T21-38-22.374324.parquet
- split: latest
path:
- results_2024-03-29T21-38-22.374324.parquet
---
# Dataset Card for Evaluation run of abhishek/autotrain-c71ux-tngfu
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/autotrain-c71ux-tngfu](https://huggingface.co/abhishek/autotrain-c71ux-tngfu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__autotrain-c71ux-tngfu",
"harness_winogrande_5",
split="train")
```
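The config listing above suggests that each run's split name is simply its timestamp with `-` and `:` replaced by `_` (for example, run `2024-03-29T21:38:22.374324` is stored under split `2024_03_29T21_38_22.374324`). A small helper sketch based on that inferred convention (this naming rule is an assumption drawn from this card, not an official API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Derive the dataset split name from a run timestamp.

    Inferred convention: '-' and ':' in the ISO timestamp become '_',
    the fractional-seconds dot is kept unchanged.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-29T21:38:22.374324"))
# -> 2024_03_29T21_38_22.374324
```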
## Latest results
These are the [latest results from run 2024-03-29T21:38:22.374324](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-c71ux-tngfu/blob/main/results_2024-03-29T21-38-22.374324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
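Per-task metrics like those above can be aggregated once the results dictionary is in hand. A minimal sketch using a hand-copied excerpt of the JSON above (in practice the full results file would be loaded from the repository; the excerpt and the aggregation shown here are purely illustrative):

```python
# Excerpt hand-copied from the results JSON above; illustrative only.
results = {
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
}

# Select only the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"{mean_mmlu_acc:.4f}")  # mean over the two excerpted MMLU subtasks
# -> 0.2026
```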
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-14b-v20.1-32k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k](https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-14b-v20.1-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T06:45:02.768859](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-14b-v20.1-32k/blob/main/results_2024-03-25T06-45-02.768859.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534834160396696,\n\
\ \"acc_stderr\": 0.03176444741162498,\n \"acc_norm\": 0.666865359936279,\n\
\ \"acc_norm_stderr\": 0.03262265630434004,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.5428075906372429,\n\
\ \"mc2_stderr\": 0.01528813077773689\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.014599131353035009,\n\
\ \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.01447113339264247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5611431985660227,\n\
\ \"acc_stderr\": 0.004952332378120329,\n \"acc_norm\": 0.7456681935869349,\n\
\ \"acc_norm_stderr\": 0.004345949382382374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5264550264550265,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.5264550264550265,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554935,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554935\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6157635467980296,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\"\
: 0.6157635467980296,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781647,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781647\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942074,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942074\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463089,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463089\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361266,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361266\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944216,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944216\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508762,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508762\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011617,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011617\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716175,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716175\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000314,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000314\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178817,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178817\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.01698703926614299,\n \"mc2\": 0.5428075906372429,\n\
\ \"mc2_stderr\": 0.01528813077773689\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930687\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|arc:challenge|25_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|gsm8k|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hellaswag|10_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T06-45-02.768859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T06-45-02.768859.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- '**/details_harness|winogrande|5_2024-03-25T06-45-02.768859.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T06-45-02.768859.parquet'
- config_name: results
data_files:
- split: 2024_03_25T06_45_02.768859
path:
- results_2024-03-25T06-45-02.768859.parquet
- split: latest
path:
- results_2024-03-25T06-45-02.768859.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k](https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-14b-v20.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-14b-v20.1-32k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-25T06:45:02.768859](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-14b-v20.1-32k/blob/main/results_2024-03-25T06-45-02.768859.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6534834160396696,
"acc_stderr": 0.03176444741162498,
"acc_norm": 0.666865359936279,
"acc_norm_stderr": 0.03262265630434004,
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.5428075906372429,
"mc2_stderr": 0.01528813077773689
},
"harness|arc:challenge|25": {
"acc": 0.5204778156996587,
"acc_stderr": 0.014599131353035009,
"acc_norm": 0.5691126279863481,
"acc_norm_stderr": 0.01447113339264247
},
"harness|hellaswag|10": {
"acc": 0.5611431985660227,
"acc_stderr": 0.004952332378120329,
"acc_norm": 0.7456681935869349,
"acc_norm_stderr": 0.004345949382382374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5264550264550265,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.5264550264550265,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554935,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781647,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781647
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942074,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942074
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463089,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463089
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361266,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944216,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944216
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508762,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508762
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011617,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000314,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000314
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178817,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178817
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.01698703926614299,
"mc2": 0.5428075906372429,
"mc2_stderr": 0.01528813077773689
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930687
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
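The per-task scores above can be post-processed in plain Python. As a minimal sketch (using a hand-copied, illustrative subset of the JSON above rather than the full results), the snippet below averages the `acc` metric across tasks:

```python
# Illustrative subset of the results JSON shown above (values copied verbatim).
results = {
    "harness|arc:challenge|25": {"acc": 0.5204778156996587},
    "harness|hellaswag|10": {"acc": 0.5611431985660227},
    "harness|winogrande|5": {"acc": 0.7505919494869772},
}

# Average the "acc" metric over the selected tasks.
accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.6107
```

The same pattern applies to any subset of the full results dictionary, e.g. filtering keys that start with `"harness|hendrycksTest-"` to recompute an MMLU-style average.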
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/lmind_hotpot_train5000_eval5000_v1_docidx | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 864508
num_examples: 5000
- name: train_recite_qa
num_bytes: 5350190
num_examples: 5000
- name: eval_qa
num_bytes: 813536
num_examples: 5000
- name: eval_recite_qa
num_bytes: 5394796
num_examples: 5000
- name: all_docs
num_bytes: 8524332
num_examples: 18224
- name: all_docs_eval
num_bytes: 8523131
num_examples: 18224
- name: train
num_bytes: 8524332
num_examples: 18224
- name: validation
num_bytes: 8523131
num_examples: 18224
download_size: 28560941
dataset_size: 46517956
---
# Dataset Card for "lmind_hotpot_train5000_eval5000_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonasantos5240/vozleon | ---
license: openrail
---
|
davanstrien/BANSpEmo | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: mfcc1
dtype: float64
- name: mfcc2
dtype: float64
- name: mfcc3
dtype: float64
- name: mfcc4
dtype: float64
- name: mfcc5
dtype: float64
- name: mfcc6
dtype: float64
- name: mfcc7
dtype: float64
- name: mfcc8
dtype: float64
- name: mfcc9
dtype: float64
- name: mfcc10
dtype: float64
- name: mfcc11
dtype: float64
- name: mfcc12
dtype: float64
- name: mfcc13
dtype: float64
- name: mfcc14
dtype: float64
- name: mfcc15
dtype: float64
- name: mfcc16
dtype: float64
- name: mfcc17
dtype: float64
- name: mfcc18
dtype: float64
- name: mfcc19
dtype: float64
- name: mfcc20
dtype: float64
- name: class
dtype: int64
- name: Class
dtype: string
splits:
- name: train
num_bytes: 912767495.0
num_examples: 792
download_size: 819764491
dataset_size: 912767495.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Qmh/lerf_ovs | ---
license: bsd-2-clause
---
|
jtatman/medqa_train_instruction_format | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 32228274
num_examples: 10178
download_size: 0
dataset_size: 32228274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medqa_train_instruction_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/neuclir-2022-zho | ---
language:
- zho
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- neuclir
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_examples: 36575
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_examples: 3179209
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_examples: 114
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
---
From the NeuCLIR TREC Track 2022: https://arxiv.org/abs/2304.12367
Generated from https://huggingface.co/datasets/neuclir/neuclir1
```
@article{lawrie2023overview,
title={Overview of the TREC 2022 NeuCLIR track},
author={Lawrie, Dawn and MacAvaney, Sean and Mayfield, James and McNamee, Paul and Oard, Douglas W and Soldaini, Luca and Yang, Eugene},
journal={arXiv preprint arXiv:2304.12367},
year={2023}
}
```
|
jbaker/FBW1Marketing | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 12968
num_examples: 10
download_size: 18583
dataset_size: 12968
---
# Dataset Card for "FBW1Marketing"
Synthetic dataset created with GPT-4 for the FourthBrain "Building with LLMs" Week 1 assignment.
Contains 10 product/marketing-email pairs:
* Product
* Short Description
* Marketing Email |
patrickvonplaten/dummy_image_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1944983.0
num_examples: 20
download_size: 1690123
dataset_size: 1944983.0
---
# Dataset Card for "dummy_image_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lyhue1991/scut_data | ---
license: apache-2.0
---
|
Databasesprojec/FinStmts_ConsUncons_English_EU_Predict_part_5 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: id
dtype: string
- name: language
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4603217624
num_examples: 10882
download_size: 2130610005
dataset_size: 4603217624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-futin__guess-en-78963b-2087067145 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-7b1
metrics: []
dataset_name: futin/guess
dataset_config: en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-7b1
* Dataset: futin/guess
* Config: en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
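For orientation, a row in the `text` / `classes` / `target` schema referenced by the col_mapping above pairs a text with a list of candidate class names and the index of the correct one. A minimal hand-made example (values are illustrative, not drawn from futin/guess):

```python
# Hypothetical row in the text / classes / target schema (not real data).
row = {
    "text": "The team shipped the new kernel driver.",
    "classes": ["sports", "technology", "politics"],
    "target": 1,  # index into `classes`
}

# A zero-shot model scores each candidate class; the gold label is:
gold_label = row["classes"][row["target"]]
print(gold_label)  # technology
```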
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
nrhone/male-black-hairstyles | ---
license: mit
---
Dataset for a LoRA that will help Stable Diffusion identify different Black male hairstyles. |
yzhuang/metatree_BNG_pendigits_ | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 103544056
num_examples: 699622
- name: validation
num_bytes: 44455944
num_examples: 300378
download_size: 145615949
dataset_size: 148000000
---
# Dataset Card for "metatree_BNG_pendigits_"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Application_100K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: log
dtype: string
splits:
- name: train
num_bytes: 28344270
num_examples: 90000
- name: validation
num_bytes: 3073127
num_examples: 10000
download_size: 6304355
dataset_size: 31417397
---
# Dataset Card for "Application_100K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jdabello/yahoo_answers_topics | ---
dataset_info:
features:
- name: id
dtype: int32
- name: topic
dtype: string
- name: question_title
dtype: string
- name: question_content
dtype: string
- name: best_answer
dtype: string
splits:
- name: train
num_bytes: 778905695
num_examples: 1400000
download_size: 511657090
dataset_size: 778905695
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yahoo_answers_topics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |