| datasetId | card |
|---|---|
liuyanchen1015/MULTI_VALUE_mrpc_existential_it | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 10238
num_examples: 40
- name: train
num_bytes: 18402
num_examples: 74
- name: validation
num_bytes: 3248
num_examples: 13
download_size: 32779
dataset_size: 31888
---
# Dataset Card for "MULTI_VALUE_mrpc_existential_it"
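The schema above pairs two sentences with a binary paraphrase label. A minimal sketch of working with rows of this shape; the loader assumes the `datasets` library is installed and defers the download until called:

```python
from typing import Dict


def describe_pair(row: Dict) -> str:
    # Render an MRPC-style row: label 1 means the sentences are paraphrases.
    verdict = "paraphrase" if row["label"] == 1 else "not a paraphrase"
    return f"{row['sentence1']!r} vs {row['sentence2']!r}: {verdict}"


def load_split(split: str = "train"):
    # Downloads from the Hub; requires `datasets` and network access,
    # so the import and call happen only on demand.
    from datasets import load_dataset
    return load_dataset("liuyanchen1015/MULTI_VALUE_mrpc_existential_it",
                        split=split)
```

`describe_pair` works on any plain dict with the same keys, so it can be tried without downloading anything.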
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sedthh/ubuntu_dialogue_qa | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
- name: METADATA
dtype: string
splits:
- name: train
num_bytes: 4021291
num_examples: 16181
download_size: 2157548
dataset_size: 4021291
license: mit
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- ubuntu
- forum
- linux
- chat
pretty_name: Q&A from the Ubuntu Dialogue Corpus
size_categories:
- 10K<n<100K
---
# Dataset Card for "ubuntu_dialogue_qa"
Filtered from the Ubuntu dialogue chat logs at https://www.kaggle.com/datasets/rtatman/ubuntu-dialogue-corpus to include Q&A pairs **only**.
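Each row holds an INSTRUCTION/RESPONSE pair. A hedged sketch of turning a row into a training prompt (the column names come from the schema above; the prompt template itself is an illustrative assumption, not part of the dataset):

```python
def to_prompt(row: dict) -> str:
    # Format a Q&A row as a simple instruction-tuning prompt.
    # The "### ..." template is an illustrative choice, not prescribed here.
    return f"### Question:\n{row['INSTRUCTION']}\n\n### Answer:\n{row['RESPONSE']}"


def load_pairs():
    # Requires `datasets` and network access; deferred so the module
    # imports cleanly offline.
    from datasets import load_dataset
    return load_dataset("sedthh/ubuntu_dialogue_qa", split="train")
```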
**Acknowledgements**
This dataset was originally collected by Ryan Lowe, Nissan Pow, Iulian V. Serban and Joelle Pineau. It is made available here under the Apache License 2.0. If you use this data in your work, please include the following citation:
Ryan Lowe, Nissan Pow, Iulian V. Serban and Joelle Pineau, "The Ubuntu Dialogue Corpus: A Large Dataset for Research in Unstructured Multi-Turn Dialogue Systems", SIGDial 2015. URL: http://www.sigdial.org/workshops/conference16/proceedings/pdf/SIGDIAL40.pdf |
Deojoandco/ah_openai_dialog_val_test | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: string
- name: over_18
dtype: bool
- name: created_utc
dtype: int64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: string
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: query
dtype: string
- name: dialog
dtype: string
splits:
- name: train
num_bytes: 5806721
num_examples: 585
download_size: 3436725
dataset_size: 5806721
---
# Dataset Card for "ah_openai_dialog_val_test"
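Each row carries a Reddit-style post with a nested `comments` list. A small sketch for picking the highest-scoring comment from a row of this shape; it is pure Python, so it runs on any dict matching the schema above:

```python
def best_comment(row: dict) -> str:
    # `comments` is a list of dicts with `body`, `score`, etc. (see schema).
    # Returns the body of the top-scoring comment, or "" if there are none.
    if not row["comments"]:
        return ""
    return max(row["comments"], key=lambda c: c["score"])["body"]
```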
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713192460 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 54006
num_examples: 124
download_size: 24135
dataset_size: 54006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
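Rows matching the schema above can be explored with plain Python. As a hedged sketch (the field names are taken from the feature list; nothing else is assumed about the data), tweets can be grouped by author like so:

```python
from collections import Counter


def tweets_per_user(rows) -> Counter:
    # Count tweets by user_name across an iterable of schema-shaped rows.
    return Counter(row["user_name"] for row in rows)
```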
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456331 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-1.3b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-1.3b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
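As a hedged sketch of scoring predictions of the kind this repository stores: assuming targets and predicted labels can be read as parallel lists (the column names follow the `col_mapping` above), accuracy is just element-wise agreement:

```python
def accuracy(targets, predictions) -> float:
    # Fraction of positions where prediction matches target.
    if len(targets) != len(predictions):
        raise ValueError("targets and predictions must be the same length")
    if not targets:
        return 0.0
    return sum(t == p for t, p in zip(targets, predictions)) / len(targets)
```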
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1 | ---
pretty_name: Evaluation run of CultriX/NeuralTrix-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CultriX/NeuralTrix-7B-v1](https://huggingface.co/CultriX/NeuralTrix-7B-v1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T21:30:36.893900](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1/blob/main/results_2024-02-09T21-30-36.893900.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6517284583600502,\n\
\ \"acc_stderr\": 0.03206274673872914,\n \"acc_norm\": 0.6512941433871612,\n\
\ \"acc_norm_stderr\": 0.032730699229841946,\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7487336484598718,\n\
\ \"mc2_stderr\": 0.014341386962976644\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7184300341296929,\n \"acc_stderr\": 0.01314337673500902,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288694\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7245568611830313,\n\
\ \"acc_stderr\": 0.0044582429605568115,\n \"acc_norm\": 0.8926508663612827,\n\
\ \"acc_norm_stderr\": 0.0030892396746331585\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.45363128491620114,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7487336484598718,\n\
\ \"mc2_stderr\": 0.014341386962976644\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \
\ \"acc_stderr\": 0.012942375603679376\n }\n}\n```"
repo_url: https://huggingface.co/CultriX/NeuralTrix-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-30-36.893900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-30-36.893900.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- '**/details_harness|winogrande|5_2024-02-09T21-30-36.893900.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T21-30-36.893900.parquet'
- config_name: results
data_files:
- split: 2024_02_09T21_30_36.893900
path:
- results_2024-02-09T21-30-36.893900.parquet
- split: latest
path:
- results_2024-02-09T21-30-36.893900.parquet
---
# Dataset Card for Evaluation run of CultriX/NeuralTrix-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CultriX/NeuralTrix-7B-v1](https://huggingface.co/CultriX/NeuralTrix-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1",
"harness_winogrande_5",
         split="latest")
```
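Besides `latest`, each configuration also keeps one split per run, named after the run's timestamp (see the `data_files` entries in the YAML above). A minimal sketch of that naming convention, useful for pinning a specific run; the commented `load_dataset` call is an illustrative usage, not part of the convention itself:

```python
REPO = "open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1"

def split_name(run_timestamp: str) -> str:
    """Map a run timestamp such as '2024-02-09T21:30:36.893900' to its
    split name ('2024_02_09T21_30_36.893900'): the dashes in the date
    and the colons in the time both become underscores."""
    date, time = run_timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

# To load one task's details at that exact run (requires network access):
# from datasets import load_dataset
# data = load_dataset(REPO, "harness_hendrycksTest_anatomy_5",
#                     split=split_name("2024-02-09T21:30:36.893900"))

print(split_name("2024-02-09T21:30:36.893900"))
# → 2024_02_09T21_30_36.893900
```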
## Latest results
These are the [latest results from run 2024-02-09T21:30:36.893900](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__NeuralTrix-7B-v1/blob/main/results_2024-02-09T21-30-36.893900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the timestamped splits and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6517284583600502,
"acc_stderr": 0.03206274673872914,
"acc_norm": 0.6512941433871612,
"acc_norm_stderr": 0.032730699229841946,
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7487336484598718,
"mc2_stderr": 0.014341386962976644
},
"harness|arc:challenge|25": {
"acc": 0.7184300341296929,
"acc_stderr": 0.01314337673500902,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288694
},
"harness|hellaswag|10": {
"acc": 0.7245568611830313,
"acc_stderr": 0.0044582429605568115,
"acc_norm": 0.8926508663612827,
"acc_norm_stderr": 0.0030892396746331585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7487336484598718,
"mc2_stderr": 0.014341386962976644
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.6709628506444276,
"acc_stderr": 0.012942375603679376
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dllllb/retailhero-uplift | ---
pretty_name: X5 RetailHero Uplift Modelling
task_categories:
- tabular-classification
tags:
- finance
configs:
- config_name: clients
data_files: data/clients.csv.gz
- config_name: products
data_files: data/products.csv.gz
- config_name: purchases
data_files: data/purchases.csv.gz
- config_name: uplift_train
data_files: data/uplift_train.csv.gz
- config_name: uplift_test
data_files: data/uplift_test.csv.gz
---
https://ods.ai/competitions/x5-retailhero-uplift-modeling |
ShenaoZ/0.0001_idpo_same_3iters_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: train_prefs_1
num_bytes: 169548771
num_examples: 20378
- name: test_prefs_1
num_bytes: 16517400
num_examples: 2000
- name: train_prefs_2
num_bytes: 173115231
num_examples: 20378
- name: test_prefs_2
num_bytes: 16786326
num_examples: 2000
download_size: 207636393
dataset_size: 375967728
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
---
# Dataset Card for "0.0001_idpo_same_3iters_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AI-Growth-Lab/patents_claims_1.5m_traim_test_embeddings | ---
license: other
---
|
Rakshit122/za1aaaaa11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 46270
num_examples: 226
download_size: 16707
dataset_size: 46270
---
# Dataset Card for "za1aaaaa11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dost-asti/Embeddings | ---
task_categories:
- feature-extraction
---
## Model Description
As part of the ITANONG project's 10-billion-token Tagalog dataset, we introduce a collection of pre-trained embedding models. These models were trained on the formal-text portion of the corpus, which is described in detail in our paper. Details of the embedding models are given below:
| **Embedding Technique** | **Variant** | **Model File Format** | **Embedding Size** |
|:-----------------------:|:-----------:|:---------------------:|:------------------:|
| Word2Vec | Skipgram | .bin | 20 |
| Word2Vec | Skipgram | .bin | 30 |
| Word2Vec | Skipgram | .bin | 50 |
| Word2Vec | Skipgram | .bin | 100 |
| Word2Vec | Skipgram | .bin | 200 |
| Word2Vec | Skipgram | .bin | 300 |
| Word2Vec | Skipgram | .txt | 20 |
| Word2Vec | Skipgram | .txt | 30 |
| Word2Vec | Skipgram | .txt | 50 |
| Word2Vec | Skipgram | .txt | 100 |
| Word2Vec | Skipgram | .txt | 200 |
| Word2Vec | Skipgram | .txt | 300 |
| Word2Vec | CBOW | .bin | 20 |
| Word2Vec | CBOW | .bin | 30 |
| Word2Vec | CBOW | .bin | 50 |
| Word2Vec | CBOW | .bin | 100 |
| Word2Vec | CBOW | .bin | 200 |
| Word2Vec | CBOW | .bin | 300 |
| Word2Vec | CBOW | .txt | 20 |
| Word2Vec | CBOW | .txt | 30 |
| Word2Vec | CBOW | .txt | 50 |
| Word2Vec | CBOW | .txt | 100 |
| Word2Vec | CBOW | .txt | 200 |
| Word2Vec | CBOW | .txt | 300 |
| FastText | Skipgram | .bin | 20 |
| FastText | Skipgram | .bin | 30 |
| FastText | Skipgram | .bin | 50 |
| FastText | Skipgram | .bin | 100 |
| FastText | Skipgram | .bin | 200 |
| FastText | Skipgram | .bin | 300 |
| FastText | Skipgram | .txt | 20 |
| FastText | Skipgram | .txt | 30 |
| FastText | Skipgram | .txt | 50 |
| FastText | Skipgram | .txt | 100 |
| FastText | Skipgram | .txt | 200 |
| FastText | Skipgram | .txt | 300 |
| FastText | CBOW | .bin | 20 |
| FastText | CBOW | .bin | 30 |
| FastText | CBOW | .bin | 50 |
| FastText | CBOW | .bin | 100 |
| FastText | CBOW | .bin | 200 |
| FastText | CBOW | .bin | 300 |
| FastText | CBOW | .txt | 20 |
| FastText | CBOW | .txt | 30 |
| FastText | CBOW | .txt | 50 |
| FastText | CBOW | .txt | 100 |
| FastText | CBOW | .txt | 200 |
| FastText | CBOW | .txt | 300 |
## Training Details
This model was trained using an Nvidia V100-32GB GPU on DOST-ASTI Computing and Archiving Research Environment (COARE) - https://asti.dost.gov.ph/projects/coare/
### Training Data
The training dataset was compiled from both formal and informal sources and consists of 194,001 instances from formal channels. More information on pre-processing and training parameters can be found in our paper.
## Citation
Paper : iTANONG-DS : A Collection of Benchmark Datasets for Downstream Natural Language Processing Tasks on Select Philippine Language
Bibtex:
```
@inproceedings{visperas-etal-2023-itanong,
title = "i{TANONG}-{DS} : A Collection of Benchmark Datasets for Downstream Natural Language Processing Tasks on Select {P}hilippine Languages",
author = "Visperas, Moses L. and
Borjal, Christalline Joie and
Adoptante, Aunhel John M and
Abacial, Danielle Shine R. and
Decano, Ma. Miciella and
Peramo, Elmer C",
editor = "Abbas, Mourad and
Freihat, Abed Alhakim",
booktitle = "Proceedings of the 6th International Conference on Natural Language and Speech Processing (ICNLSP 2023)",
month = dec,
year = "2023",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.icnlsp-1.34",
pages = "316--323",
}
``` |
tyzhu/squad_rare_v4_train_30_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 673207
num_examples: 368
- name: validation
num_bytes: 82956
num_examples: 50
download_size: 136492
dataset_size: 756163
---
# Dataset Card for "squad_rare_v4_train_30_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Memis/turkishReviews-ds-mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: review_lenght
dtype: int64
splits:
- name: train
num_bytes: 12474743.36813378
num_examples: 33637
- name: validation
num_bytes: 1386288.6318662209
num_examples: 3738
download_size: 0
dataset_size: 13861032.0
---
# Dataset Card for "turkishReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vlofgren/cabrita-and-guanaco-PTBR | ---
license: openrail
---
|
one-sec-cv12/chunk_79 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24403395600.625
num_examples: 254075
download_size: 22771478558
dataset_size: 24403395600.625
---
# Dataset Card for "chunk_79"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_second_sent_train_100_eval_10_hint10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 273137
num_examples: 210
- name: validation
num_bytes: 10682
num_examples: 10
download_size: 142218
dataset_size: 283819
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_100_eval_10_hint10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DreamCloudWalker/SDXL_Dreambooth | ---
license: apache-2.0
---
|
enoreyes/imdb_3000_sphere | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: imdb_3000
size_categories:
- 1K<n<10K
---
# Dataset Card for IMDB 3000 Sphere
- **Homepage:** [http://ai.stanford.edu/~amaas/data/sentiment/](http://ai.stanford.edu/~amaas/data/sentiment/)
## Dataset Summary
Large Movie Review Dataset.
This is a 3,000-item selection from the `imdb` dataset for binary sentiment classification, for use in the Sphere course on AutoTrain.
## Dataset Structure
An example of 'train' looks as follows.
```
{
"label": 0,
"text": "Goodbye world2\n"
}
``` |
mennis88/Alina | ---
license: other
---
|
Mzh666/DONGZU | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 88067000.0
num_examples: 124
download_size: 55259718
dataset_size: 88067000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sadiksha/sem-eval | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
sequence: string
- name: polarity
sequence: string
- name: joint
sequence: string
- name: category_labels
sequence: int64
- name: polarity_labels
sequence: int64
- name: joint_labels
sequence: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 887730
num_examples: 2853
- name: test
num_bytes: 236239
num_examples: 749
download_size: 168506
dataset_size: 1123969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
cj-mills/labelme-keypoint-eyes-noses-dataset | ---
license: mit
---
|
ethos | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- found
- other
language:
- en
license:
- agpl-3.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-label-classification
- sentiment-classification
paperswithcode_id: ethos
pretty_name: onlinE haTe speecH detectiOn dataSet
tags:
- Hate Speech Detection
dataset_info:
- config_name: binary
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': no_hate_speech
'1': hate_speech
splits:
- name: train
num_bytes: 124823
num_examples: 998
download_size: 123919
dataset_size: 124823
- config_name: multilabel
features:
- name: text
dtype: string
- name: violence
dtype:
class_label:
names:
'0': not_violent
'1': violent
- name: directed_vs_generalized
dtype:
class_label:
names:
'0': generalied
'1': directed
- name: gender
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
- name: race
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
- name: national_origin
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
- name: disability
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
- name: religion
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
- name: sexual_orientation
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 79112
num_examples: 433
download_size: 62836
dataset_size: 79112
config_names:
- binary
- multilabel
---
# Dataset Card for Ethos
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [ETHOS Hate Speech Dataset](https://github.com/intelligence-csd-auth-gr/Ethos-Hate-Speech-Dataset)
- **Repository:**[ETHOS Hate Speech Dataset](https://github.com/intelligence-csd-auth-gr/Ethos-Hate-Speech-Dataset)
- **Paper:**[ETHOS: an Online Hate Speech Detection Dataset](https://arxiv.org/abs/2006.08328)
### Dataset Summary
ETHOS: onlinE haTe speecH detectiOn dataSet. This repository contains a dataset for hate speech detection on social media platforms, called Ethos. There are two variations of the dataset:
- **Ethos_Dataset_Binary**: contains 998 comments, each with a label indicating the *presence* or *absence* of hate speech. 565 of them do not contain hate speech, while the remaining 433 do.
- **Ethos_Dataset_Multi_Label**: contains 8 labels for the 433 comments with hate speech content. These labels are *violence* (whether it incites violence (1) or not (0)), *directed_vs_generalized* (whether it is directed at a person (1) or a group (0)), and 6 labels for the category of hate speech: *gender*, *race*, *national_origin*, *disability*, *religion* and *sexual_orientation*.
***Ethos /ˈiːθɒs/***
is a Greek word meaning “character” that is used to describe the guiding beliefs or ideals that characterize a community, nation, or ideology. The Greeks also used this word to refer to the power of music to influence emotions, behaviors, and even morals.
### Supported Tasks and Leaderboards
[More Information Needed]
- `text-classification-other-Hate Speech Detection`, `sentiment-classification`,`multi-label-classification`: The dataset can be used to train a model for hate speech detection. Moreover, it can be used as a benchmark dataset for multi label classification algorithms.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
A typical data point in the binary version comprises a comment, with a `text` field containing the text and a `label` indicating whether the comment contains hate speech content (1 - hate speech) or not (0 - no hate speech). The multilabel version adds further labels: *violence* (whether it incites violence (1) or not (0)), *directed_vs_generalized* (whether it is directed at a person (1) or a group (0)), and 6 labels for the category of hate speech: *gender*, *race*, *national_origin*, *disability*, *religion* and *sexual_orientation*.
An example from the binary version, which is offensive, but it does not contain hate speech content:
```
{'text': 'What the fuck stupid people !!!',
'label': '0'
}
```
An example from the multi-label version, which contains hate speech content towards women (gender):
```
{'text': "You should know women's sports are a joke",
`violence`: 0,
`directed_vs_generalized`: 0,
`gender`: 1,
`race`: 0,
`national_origin`: 0,
`disability`: 0,
`religion`: 0,
`sexual_orientation`: 0
}
```
### Data Fields
Ethos Binary:
- `text`: a `string` feature containing the text of the comment.
- `label`: a classification label, with possible values including `no_hate_speech`, `hate_speech`.
Ethos Multilabel:
- `text`: a `string` feature containing the text of the comment.
- `violence`: a classification label, with possible values including `not_violent`, `violent`.
- `directed_vs_generalized`: a classification label, with possible values including `generalized`, `directed`.
- `gender`: a classification label, with possible values including `false`, `true`.
- `race`: a classification label, with possible values including `false`, `true`.
- `national_origin`: a classification label, with possible values including `false`, `true`.
- `disability`: a classification label, with possible values including `false`, `true`.
- `religion`: a classification label, with possible values including `false`, `true`.
- `sexual_orientation`: a classification label, with possible values including `false`, `true`.
### Data Splits
The data is split into binary and multilabel. Multilabel is a subset of the binary version.
| | Instances | Labels |
| ----- | ------ | ----- |
| binary | 998 | 1 |
| multilabel | 433 | 8 |
## Dataset Creation
### Curation Rationale
The dataset was built by gathering online comments from YouTube videos and Reddit, drawn from videos and subreddits likely to attract hate speech content.
### Source Data
#### Initial Data Collection and Normalization
The initial data we used came from the Hatebusters platform: [Original data used](https://intelligence.csd.auth.gr/topics/hate-speech-detection/); those data were not included in this dataset.
#### Who are the source language producers?
The language producers are users of Reddit and YouTube. More information can be found in this paper: [ETHOS: an Online Hate Speech Detection Dataset](https://arxiv.org/abs/2006.08328)
### Annotations
#### Annotation process
The annotation process is detailed in the third section of this paper: [ETHOS: an Online Hate Speech Detection Dataset](https://arxiv.org/abs/2006.08328)
#### Who are the annotators?
Originally annotated by Ioannis Mollas and validated through the Figure Eight platform (Appen).
### Personal and Sensitive Information
No personal or sensitive information is included in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset will help advance automated hate speech detection tools. Those tools can have a great impact on preventing social issues.
### Discussion of Biases
This dataset tries to be unbiased towards its classes and labels.
### Other Known Limitations
The dataset is relatively small and should be used in combination with larger datasets.
## Additional Information
### Dataset Curators
The dataset was initially created by [Intelligent Systems Lab](https://intelligence.csd.auth.gr).
### Licensing Information
The licensing status of the datasets is [GNU GPLv3](https://choosealicense.com/licenses/gpl-3.0/).
### Citation Information
```
@misc{mollas2020ethos,
title={ETHOS: an Online Hate Speech Detection Dataset},
author={Ioannis Mollas and Zoe Chrysopoulou and Stamatis Karlos and Grigorios Tsoumakas},
year={2020},
eprint={2006.08328},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@iamollas](https://github.com/iamollas) for adding this dataset. |
mteb/msmarco | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- msmarco
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 15384091
num_examples: 532751
- name: dev
num_bytes: 217670
num_examples: 7437
- name: test
num_bytes: 270432
num_examples: 9260
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 3149969815
num_examples: 8841823
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 24100662
num_examples: 509962
configs:
- config_name: default
data_files:
- split: train
path: qrels/train.jsonl
- split: dev
path: qrels/dev.jsonl
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
--- |
emilykang/dentistry_train | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 645973737.5
num_examples: 1500
download_size: 630073075
dataset_size: 645973737.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-big_patent-y-3c6f0a-1465253965 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- big_patent
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
metrics: []
dataset_name: big_patent
dataset_config: y
dataset_split: test
col_mapping:
text: description
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
* Dataset: big_patent
* Config: y
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
rookshanks/toy_dataset | ---
dataset_info:
features:
- name: question
sequence: int64
- name: answer
sequence: int64
- name: teacher_compute_loss
sequence: int64
splits:
- name: train
num_bytes: 4416560
num_examples: 8000
- name: validation
num_bytes: 569232
num_examples: 1000
- name: test
num_bytes: 556208
num_examples: 1000
download_size: 201021
dataset_size: 5542000
---
# Dataset Card for "toy_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suryam13/dataset1 | ---
dataset_info:
features:
- name: System - English Data
dtype: string
- name: Output - Thanglish
dtype: string
splits:
- name: train
num_bytes: 3882.4
num_examples: 8
- name: test
num_bytes: 970.6
num_examples: 2
download_size: 11908
dataset_size: 4853.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
rbel/jobtitles | ---
license: apache-2.0
---
|
seanghay/khmerfonts-info-previews | ---
license: cc-by-4.0
language:
- km
pretty_name: khmerfonts
dataset_info:
features:
- name: file_name
dtype: image
- name: text
dtype: string
---
## Dataset Info
**Total files**: 26,591 files == (`(num_fonts * num_sentences) - num_0_byte_fonts - num_error_image`)
**Total fonts**: 2,972 fonts
**Total sentences**: 10
## Dataset Creation Info
All images were downloaded from [khmerfonts.info](https://khmerfonts.info) using the script below:
```python
with open("filelist.txt", "w") as outfile:
items = [
f"https://www.khmerfonts.info/preview.php?font={font + 1}&sample={sample + 1}\n\tout=khmerfonts-{font + 1}-{sample + 1}.png"
for font in range(2972) # maximum id at the moment
for sample in range(10)
]
outfile.write("\n".join(items))
```
Download all files using `aria2c`
```shell
aria2c -i filelist.txt -d data -j16
```
Find 0-byte files and delete
```shell
find data/ -size 0 -delete
```
```python
sentences = [
"ជាតិពាលមិនដឹងគួរ គ្មានគេសួរសោកចង់ជាក់ ឆ្លើយឆ្លងផងរាក់ទាក់ ក្បួនហិនលក្ខណ៍ធ្លាក់លើខ្លួន ។",
"ចងអ្វីមិនជាប់ស្មើសង្សារ ការអ្វីមិនស្មើការប្រតិបត្តិ ស្ងាត់អ្វីមិនស្មើចិត្តអរហត្ត កាចអ្វីមិនស្មើចិត្តពាលា ។",
"ចំណេះវិជ្ជាលោកចែងចាត់ ទុកជាសម្បត្តិសំបូរបាន ទោះបីក្រក្សត់អត់ប៉ុន្មាន គង់តែបានគ្រាន់អាស្រ័យ ។",
"ឈ្លោះគ្នាក្នុងគ្រួសារ ដូចស្រាតកាយាបង្ហាញញាតិ ឈ្លោះគ្នាក្នុងសង្គមជាតិ ដូចលាតកំណប់បង្ហាញចោរ ។",
"ជាប់ជ្រួលច្រវាក់ភក្ត្រស្រស់ស្រាយ គួរខ្លាចខ្លួនក្លាយជាក្លៀវក្លា វង្វេងផ្លូវមិនសួរនរណា តនឹងបច្ចាឥតអាវុធ ។",
"កុំគិតតែរៀនចង់ធ្វើមន្ត្រី ស្អប់ខ្ពើមភក់ដីនាំអោយក្រ ត្រូវរៀនធ្វើជាកសិករ ទើបមានទ្រព្យតទៅខាងក្រោយ ។",
"ជនណាទ្រាំអត់ ខន្តីសង្កត់ រក្សាមាយាទ មិនខឹងផ្ដេសផ្ដាស ពួកបណ្ឌិតជាតិ សរសើរជាអាទ៍ ថាអ្នកធ្ងន់ធ្ងរ ។",
"ជនពាលដល់ពេលកើតកលិយុគ ទេវតាឲ្យទុក្ខចាំផ្ដន្ទា ពួកប្រាជ្ញសប្បរសកាន់ធម្មា ដល់ពេលទុក្ខាទេវតាជួយ ។",
"ចង់ល្អហួសមាឌ ចង់បានហួសខ្នាតកំរិតមាត្រា មិនបានដូចប៉ង បំណងប្រាថ្នា ខូចទាំងទ្រព្យា គួរបានក៏បង់ ។",
"ជាតិមនុស្សពាលពោលមិនពិត កុំយកធ្វើមិត្តខាតរបស់ មនុស្សសុចរិតចិត្តសប្បុរស ស្រឡាញ់ស្មោះចិត្តឲ្យស្មើ ។",
]
```
|
1aurent/unsplash-lite | ---
dataset_info:
features:
- name: photo
struct:
- name: id
dtype: string
- name: url
dtype: string
- name: image_url
dtype: string
- name: submitted_at
dtype: string
- name: featured
dtype: bool
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: aspect_ratio
dtype: float32
- name: description
dtype: string
- name: blur_hash
dtype: string
- name: photographer
struct:
- name: username
dtype: string
- name: first_name
dtype: string
- name: last_name
dtype: string
- name: exif
struct:
- name: camera_make
dtype: string
- name: camera_model
dtype: string
- name: iso
dtype: string
- name: aperture_value
dtype: string
- name: focal_length
dtype: string
- name: exposure_time
dtype: string
- name: location
struct:
- name: name
dtype: string
- name: latitude
dtype: float32
- name: longitude
dtype: float32
- name: country
dtype: string
- name: city
dtype: string
- name: stats
struct:
- name: views
dtype: uint32
- name: downloads
dtype: uint32
- name: ai
struct:
- name: description
dtype: string
- name: primary_landmark_name
dtype: string
- name: primary_landmark_latitude
dtype: string
- name: primary_landmark_longitude
dtype: string
- name: primary_landmark_confidence
dtype: string
- name: keywords
list:
- name: keyword
dtype: string
- name: ai_service_1_confidence
dtype: string
- name: ai_service_2_confidence
dtype: string
- name: suggested_by_user
dtype: bool
- name: collections
list:
- name: collection_id
dtype: string
- name: collection_title
dtype: string
- name: photo_collected_at
dtype: string
- name: conversions
list:
- name: converted_at
dtype: string
- name: conversion_type
dtype: string
- name: keyword
dtype: string
- name: anonymous_user_id
dtype: string
- name: conversion_country
dtype: string
- name: colors
list:
- name: hex
dtype: string
- name: red
dtype: uint8
- name: green
dtype: uint8
- name: blue
dtype: uint8
- name: keyword
dtype: string
- name: ai_coverage
dtype: float32
- name: ai_score
dtype: float32
splits:
- name: train
num_bytes: 1202216966
num_examples: 25000
download_size: 618337921
dataset_size: 1202216966
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
license_name: unsplash-commercial
license_link: https://github.com/unsplash/datasets/blob/master/DOCS.md
task_categories:
- text-to-image
- image-to-text
language:
- en
tags:
- unsplash
- v1.2.1
pretty_name: Unsplash Lite
size_categories:
- 10K<n<100K
---
# The Unsplash Lite Dataset (v1.2.1)

The Lite dataset contains all of the same fields as the Full dataset, but is limited to ~25,000 photos.
It can be used for both commercial and non-commercial usage, provided you abide by [the terms](https://github.com/unsplash/datasets/blob/master/TERMS.md).
The Unsplash Dataset is made available for research purposes.
[It cannot be used to redistribute the images contained within](https://github.com/unsplash/datasets/blob/master/TERMS.md).
To use the Unsplash library in a product, see [the Unsplash API](https://unsplash.com/developers).
 |
Codec-SUPERB/librispeech_extract_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 5670752378
num_examples: 292367
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 5670752378
num_examples: 292367
- name: audiodec_24k_320d
num_bytes: 18134255786
num_examples: 292367
- name: academicodec_hifi_24k_320d
num_bytes: 8499202394
num_examples: 292367
download_size: 6774606005
dataset_size: 37974962936
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
---
|
HamdanXI/arb-eng-parallel-10k-splitted-cosine-5 | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 3013957
num_examples: 6489
- name: validation
num_bytes: 407437
num_examples: 1000
- name: test
num_bytes: 419389
num_examples: 1000
download_size: 2165455
dataset_size: 3840783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
hippocrates/pubmedsum | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11428
num_examples: 1
- name: test
num_bytes: 4144995
num_examples: 200
download_size: 2086997
dataset_size: 4156423
---
# Dataset Card for "pubmedsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umarigan/youtube_scripts | ---
license: apache-2.0
task_categories:
- zero-shot-classification
- summarization
language:
- en
---
This dataset was collected from various sources.
Once I obtained the URLs of the YouTube videos, I used LangChain's `YoutubeLoader` to get the text of each video.
Source of data: https://github.com/talesmarra/youtube_data_analysis
Tasks: summarization, named entity recognition |
Nexdata/10020_Images_of_Arabic_Natural_Scene_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
10,020 images of Arabic natural-scene OCR data, covering a variety of natural scenes and multiple shooting angles. For labeling, line-level text is annotated with quadrilateral bounding boxes and transcribed line by line. This dataset can be used for OCR tasks in Arabic-speaking countries.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1306?source=Huggingface
## Data size
10,020 images
## Collecting environment
including shop plaque, stop board, poster, ticket, road sign, comic, cover picture, prompt/reminder, warning, packing instruction, menu, building sign, magazine book covers, etc.
## Data diversity
including a variety of natural scenes, multiple shooting angles
## Device
cellphone, camera
## Photographic angle
looking up angle, looking down angle, eye-level angle
## Data format
the image data format is .jpg, the annotation file format is .json
## Annotation content
line-level quadrilateral bounding box annotation and transcription for the texts
## Accuracy
an annotation is qualified when each vertex of the quadrilateral bounding box is within 5 pixels of its true position; the accuracy of bounding boxes is not less than 97%; the text transcription accuracy is not less than 97%
## Licensing Information
Commercial License
|
P1ot3r/cv-pl-train-whisper-tiny | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15886403552
num_examples: 16539
download_size: 3037173064
dataset_size: 15886403552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KagglingFace/FYP-KiTS-A-Preprocessed | ---
license: mit
---
|
mikhail-panzo/raw_fleurs_fil | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: speaker_id
dtype: string
splits:
- name: train
num_bytes: 2569182963.555
num_examples: 2619
download_size: 2553278768
dataset_size: 2569182963.555
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
one-sec-cv12/chunk_226 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21978087552.0
num_examples: 228824
download_size: 18662581812
dataset_size: 21978087552.0
---
# Dataset Card for "chunk_226"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sieu-n/kollm-eval-prompt | ---
license: cc-by-nc-4.0
---
|
davanstrien/autotrain-data-abstracts | |
open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-13B-Instruct-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-13B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T11:46:33.264561](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16/blob/main/results_2023-10-22T11-46-33.264561.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413506,\n \"f1\": 0.05136010906040279,\n\
\ \"f1_stderr\": 0.001238131643997091,\n \"acc\": 0.4034791730120101,\n\
\ \"acc_stderr\": 0.011133121900373116\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413506,\n\
\ \"f1\": 0.05136010906040279,\n \"f1_stderr\": 0.001238131643997091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12661106899166036,\n \
\ \"acc_stderr\": 0.009159715283081094\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.013106528517665136\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T11_46_33.264561
path:
- '**/details_harness|drop|3_2023-10-22T11-46-33.264561.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T11-46-33.264561.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T11_46_33.264561
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-46-33.264561.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-46-33.264561.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T11_46_33.264561
path:
- '**/details_harness|winogrande|5_2023-10-22T11-46-33.264561.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T11-46-33.264561.parquet'
- config_name: results
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- results_2023-08-25T23:11:55.664382.parquet
- split: 2023_10_22T11_46_33.264561
path:
- results_2023-10-22T11-46-33.264561.parquet
- split: latest
path:
- results_2023-10-22T11-46-33.264561.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-13B-Instruct-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-13B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T11:46:33.264561](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16/blob/main/results_2023-10-22T11-46-33.264561.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413506,
"f1": 0.05136010906040279,
"f1_stderr": 0.001238131643997091,
"acc": 0.4034791730120101,
"acc_stderr": 0.011133121900373116
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413506,
"f1": 0.05136010906040279,
"f1_stderr": 0.001238131643997091
},
"harness|gsm8k|5": {
"acc": 0.12661106899166036,
"acc_stderr": 0.009159715283081094
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.013106528517665136
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FVilmar/lombardi | ---
license: openrail
---
|
EuRoxxx/Gielavocal | ---
license: openrail
---
|
polejowska/lcbsi-wbc-ap | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': basophil
'1': eosinophil
'2': lymphocyte
'3': monocyte
'4': neutrophil
splits:
- name: train
num_bytes: 25369707.0
num_examples: 3500
- name: test
num_bytes: 5540002.0
num_examples: 750
- name: valid
num_bytes: 5488683.0
num_examples: 750
download_size: 36231350
dataset_size: 36398392.0
---
# Dataset Card for "lcbsi-wbc-ap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_mathematics-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 3554
num_examples: 5
download_size: 0
dataset_size: 3554
---
# Dataset Card for "mmlu-college_mathematics-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/april_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of april/エイプリル/四月 (Arknights)
This is the dataset of april/エイプリル/四月 (Arknights), containing 105 images and their tags.
The core tags of this character are `animal_ears, rabbit_ears, long_hair, brown_hair, breasts, purple_eyes, hair_between_eyes, headphones, medium_breasts, black_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 105 | 196.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/april_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 105 | 165.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/april_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 291 | 337.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/april_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/april_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_dress, long_sleeves, simple_background, smile, :p, black_gloves, single_glove, cowboy_shot, infection_monitor_(arknights), white_background, fingerless_gloves, implied_extra_ears, very_long_hair, asymmetrical_sleeves |
| 1 | 6 |  |  |  |  |  | 1girl, arrow_(projectile), short_sleeves, solo, white_dress, holding_bow_(weapon), infection_monitor_(arknights), simple_background, smile, white_socks, black_gloves, full_body, looking_at_viewer, quiver, tongue_out, fingerless_gloves, implied_extra_ears, rabbit_girl, single_glove, sneakers, very_long_hair, white_background, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_dress | long_sleeves | simple_background | smile | :p | black_gloves | single_glove | cowboy_shot | infection_monitor_(arknights) | white_background | fingerless_gloves | implied_extra_ears | very_long_hair | asymmetrical_sleeves | arrow_(projectile) | short_sleeves | holding_bow_(weapon) | white_socks | full_body | quiver | tongue_out | rabbit_girl | sneakers | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:---------------|:--------------------|:--------|:-----|:---------------|:---------------|:--------------|:--------------------------------|:-------------------|:--------------------|:---------------------|:-----------------|:-----------------------|:---------------------|:----------------|:-----------------------|:--------------|:------------|:---------|:-------------|:--------------|:-----------|:-----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X |
|
gokuls/wiki_book_corpus_complete_processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 22201610400.0
num_examples: 6167114
download_size: 2763194793
dataset_size: 22201610400.0
---
# Dataset Card for "wiki_book_corpus_complete_processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atila2251/rafalucas | ---
license: openrail
---
|
Dahoas/pretrained-unet-cifar10-32 | ---
dataset_info:
features:
- name: images
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 634400000
num_examples: 50000
download_size: 646502859
dataset_size: 634400000
---
# Dataset Card for "pretrained-unet-cifar10-32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ticoAg/Chinese-medical-dialogue | ---
license: apache-2.0
raw csv: 356 MB
examples: 799743
---
# Note
Data processed from [Chinese-medical-dialogue-data](https://github.com/Toyhom/Chinese-medical-dialogue-data).
Single-turn doctor-patient dialogues (单轮医患对话).
## raw data samples
|department|title|ask|answer|
|----------|-----|---|------|
|心血管科|高血压患者能吃党参吗?|我有高血压这两天女婿来的时候给我拿了些党参泡水喝,您好高血压可以吃党参吗?|高血压病人可以口服党参的。党参有降血脂,降血压的作用,可以彻底消除血液中的垃圾,从而对冠心病以及心血管疾病的患者都有一定的稳定预防工作作用,因此平时口服党参能远离三高的危害。另外党参除了益气养血,降低中枢神经作用,调整消化系统功能,健脾补肺的功能。感谢您的进行咨询,期望我的解释对你有所帮助。|
|内分泌科|糖尿病还会进行遗传吗?|糖尿病有隔代遗传吗?我妈是糖尿病,很多年了,也没养好,我现在也是,我妹子也是,我儿子现在二十岁,没什么问题,但是以后会不会也得糖尿病啊,真是难过,我现在就已经开始让他控制点吃东西。|2型糖尿病的隔代遗传概率为父母患糖尿病,临产的发生率为40%,比一般人患糖尿病,疾病,如何更重要的选择因素基于生活方式的,后天也隔代遗传隔代遗传易感性更公正,增强患糖尿病的风险,低糖低脂肪,平时清淡饮食,适当锻练,增强监测数据,血糖仪买个备取。|
|内分泌科|糖尿病会出现什么症状?|我是不是糖尿病,如何严重,糖尿病的典型症状有哪些?血糖高之后感觉什么东西都不能够吃了,有糖分的东西都不敢吃,怕血糖又高,不知晓是不是变严重了,糖尿病的症状有哪些?|你好,根据你描述的情况看来糖尿病是可以致使血糖异常下降的,可以再次出现三多一少的症状,如喝水多,小便多,饭量大,体重减轻,建议你尽快复诊当地医院内分泌科看一看,需要有让大夫仔细检查你的血糖水平,明确有否糖尿病的情况,及时动用降糖药治疗,平时一定少吃甜食,足量锻练。|
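The conversion from a raw row to the processed instruction format shown below can be sketched as follows. This is an assumed reconstruction of the mapping (the helper name is ours), not the exact processing script:

```python
# Assumed sketch of the raw-row -> instruction-format mapping; the actual
# processing script may additionally filter or clean rows.
def to_instruction_format(row: dict) -> dict:
    return {
        "instruction": row["title"],  # question title
        "input": row["ask"],          # patient's full question
        "output": row["answer"],      # doctor's answer
        "history": None,              # single-turn dialogue: no prior turns
    }

raw = {
    "department": "心血管科",
    "title": "高血压患者能吃党参吗?",
    "ask": "我有高血压这两天女婿来的时候给我拿了些党参泡水喝...",
    "answer": "高血压病人可以口服党参的...",
}
print(to_instruction_format(raw)["instruction"])
```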
## processed data sample
```json
[
    {"instruction": "title", "input": "ask", "output": "answer", "history": null}
]
``` |
DTU54DL/commonvoice10k | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
paperswithcode_id: acronym-identification
pretty_name: Acronym Identification Dataset
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- token-classification-other-acronym-identification
train-eval-index:
- col_mapping:
labels: tags
tokens: tokens
config: default
splits:
eval_split: test
task: token-classification
task_id: entity_extraction
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/18c4ecbf | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1330
dataset_size: 180
---
# Dataset Card for "18c4ecbf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcmt/splat-samples | ---
license: mit
---
|
Aoschu/German_invoices_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
- name: bbox
dtype: string
- name: transcription
dtype: string
- name: annotator
dtype: float64
- name: annotation_id
dtype: float64
splits:
- name: train
num_bytes: 14866686.0
num_examples: 97
download_size: 10032440
dataset_size: 14866686.0
---
# Dataset Card for "German_invoices_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tartuNLP/finno-ugric-benchmark | ---
license: cc-by-4.0
---
|
AdapterOcean/python3-standardized_cluster_17_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8122221
num_examples: 5456
download_size: 1402681
dataset_size: 8122221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_17_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SaffalPoosh/HR-test-VITON | ---
dataset_info:
features:
- name: agnostic-v3.2
dtype: image
- name: cloth-mask
dtype: image
- name: image-densepose
dtype: image
- name: image-parse-v3
dtype: image
- name: openpose_json
dtype: string
- name: cloth
dtype: image
- name: image
dtype: image
- name: image-parse-agnostic-v3.2
dtype: image
- name: openpose_img
dtype: image
splits:
- name: train
num_bytes: 782683174.576
num_examples: 2032
download_size: 721576589
dataset_size: 782683174.576
---
# Dataset Card for "HR-test-VITON"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/tweets2013-ia_trec-mb-2014 | ---
pretty_name: '`tweets2013-ia/trec-mb-2014`'
viewer: false
source_datasets: ['irds/tweets2013-ia']
task_categories:
- text-retrieval
---
# Dataset Card for `tweets2013-ia/trec-mb-2014`
The `tweets2013-ia/trec-mb-2014` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/tweets2013-ia#tweets2013-ia/trec-mb-2014).
# Data
This dataset provides:
- `queries` (i.e., topics); count=55
- `qrels`: (relevance assessments); count=57,985
- For `docs`, use [`irds/tweets2013-ia`](https://huggingface.co/datasets/irds/tweets2013-ia)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/tweets2013-ia_trec-mb-2014', 'queries')
for record in queries:
record # {'query_id': ..., 'query': ..., 'time': ..., 'tweet_time': ..., 'description': ...}
qrels = load_dataset('irds/tweets2013-ia_trec-mb-2014', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
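The streamed qrels records can be collapsed into a nested `{query_id: {doc_id: relevance}}` dict, the shape most evaluation tooling expects. A small sketch (the helper name and sample records are ours, independent of ir-datasets itself):

```python
# Hypothetical helper: group flat qrels records by query for evaluation use.
def qrels_to_dict(records):
    out = {}
    for r in records:
        out.setdefault(r["query_id"], {})[r["doc_id"]] = r["relevance"]
    return out

# Stand-in records mirroring the qrels fields listed above.
sample = [
    {"query_id": "171", "doc_id": "t1", "relevance": 1, "iteration": "0"},
    {"query_id": "171", "doc_id": "t2", "relevance": 0, "iteration": "0"},
]
print(qrels_to_dict(sample))
```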
## Citation Information
```
@inproceedings{Lin2014Microblog,
title={Overview of the TREC-2014 Microblog Track},
author={Jimmy Lin and Miles Efron and Yulu Wang and Garrick Sherman},
booktitle={TREC},
year={2014}
}
@inproceedings{Sequiera2017TweetsIA,
title={Finally, a Downloadable Test Collection of Tweets},
author={Royal Sequiera and Jimmy Lin},
booktitle={SIGIR},
year={2017}
}
```
|
FloatAI/humaneval-xl | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: language
dtype: string
- name: prompt
dtype: string
- name: test
dtype: string
- name: entry_point
dtype: string
- name: canonical_solution
dtype: string
- name: natural_language
dtype: string
splits:
- name: python
num_examples: 80
license: apache-2.0
task_categories:
- text-generation
tags:
- code-generation
language:
- en
- zh
- ru
- de
- es
- fr
- it
- pt
- el
- hu
- nl
- fi
- id
- tr
- ar
- vi
- bg
- fa
- ms
- he
- et
- tl
- af
---
# [HumanEval-XL: An Execution-based Multilingual Code Generation Benchmark Across 23 Natural Languages and 12 Programming Languages](https://arxiv.org/abs/2402.16694)
<a href="https://arxiv.org/abs/2402.16694" target="_blank">
<img alt="LREC-COLING 2024" src="https://img.shields.io/badge/Proceedings-COLING 2024-red" />
</a>
This repository contains data and evaluation code for the paper "[HumanEval-XL: A Multilingual Code Generation Benchmark for Cross-lingual Natural Language Generalization](https://arxiv.org/pdf/2402.16694)".
## Dataset Summary
We introduce HumanEval-XL, a massively multilingual code generation benchmark specifically crafted to address the lack of cross-lingual evaluation for code generation. HumanEval-XL establishes connections between 23 natural languages (NLs) and 12 programming languages (PLs), and comprises a collection of 22,080 prompts with an average of 8.33 test cases each. By ensuring parallel data across multiple NLs and PLs, HumanEval-XL offers a comprehensive evaluation platform for multilingual LLMs, allowing assessment of their understanding of different NLs. Our work serves as a pioneering step towards filling the void in evaluating NL generalization in the area of multilingual code generation. We make our evaluation code and data publicly available at https://github.com/FloatAI/HumanEval-XL.
## Dataset Structure
We have in total:
**12 PLs** are: "python", "java", "javascript", "csharp", "go", "kotlin", "perl", "php", "ruby", "scala", "swift", "typescript"
**23 NLs** are: "English", "Russian", "Chinese", "German", "Spanish", "French", "Italian", "Portuguese", "Greek", "Hungarian", "Dutch", "Finnish", "Indonesian", "Turkish", "Arabic", "Vietnamese", "Bulgarian", "Persian", "Malay", "Hebrew", "Estonian", "Tagalog", "Afrikaans"
To load the data for a specific programming language:
```python
from datasets import load_dataset
dataset = load_dataset("FloatAI/HumanEval-XL", "python")
DatasetDict({
English: Dataset({
features: ['task_id', 'language', 'prompt', 'description', 'test', 'entry_point', 'canonical_solution', 'natural_language'],
num_rows: 80
})
Russian: Dataset({
features: ['task_id', 'language', 'prompt', 'description', 'test', 'entry_point', 'canonical_solution', 'natural_language'],
num_rows: 80
})
Chinese: Dataset({
features: ['task_id', 'language', 'prompt', 'description', 'test', 'entry_point', 'canonical_solution', 'natural_language'],
num_rows: 80
})
⋮
Afrikaans: Dataset({
features: ['task_id', 'language', 'prompt', 'description', 'test', 'entry_point', 'canonical_solution', 'natural_language'],
num_rows: 80
})
})
```
### Data Instances
An example of a dataset instance (in the python split with Chinese prompts, i.e. `dataset["Chinese"][0]`):
```python
{
'task_id': 'python/0',
'language': 'python',
'prompt': 'from typing import List\n\n\ndef below_zero(operations: List[int]) -> bool:\n """ 你会得到一个银行账户的存款和取款操作列表,该账户从零余额开始。你的任务是检测账户余额是否在任何时候降至零以下,并在该点返回True。否则应返回False。\n \n >>> below_zero([1, 2, 3])\n False\n >>> below_zero([1, 2, -4, 5])\n True\n """\n',
'description': '你会得到一个银行账户的存款和取款操作列表,该账户从零余额开始。你的任务是检测账户余额是否在任何时候降至零以下,并在该点返回True。否则应返回False。\n ',
'test': "\n\nMETADATA = {\n 'author': 'jt',\n 'dataset': 'test'\n}\n\n\ndef check(candidate):\n assert candidate([]) == False\n assert candidate([1, 2, -3, 1, 2, -3]) == False\n assert candidate([1, 2, -4, 5, 6]) == True\n assert candidate([1, -1, 2, -2, 5, -5, 4, -4]) == False\n assert candidate([1, -1, 2, -2, 5, -5, 4, -5]) == True\n assert candidate([1, -2, 2, -2, 5, -5, 4, -4]) == True\n",
'entry_point': 'below_zero',
'canonical_solution': ' balance = 0\n\n for op in operations:\n balance += op\n if balance < 0:\n return True\n\n return False\n',
'natural_language': 'Chinese'
}
```
### Data Fields
- `task_id`: identifier for the data sample
- `prompt`: input for the model containing function header and docstrings
- `canonical_solution`: solution for the problem in the `prompt`
- `description`: task description
- `test`: contains function to test generated code for correctness
- `entry_point`: entry point for test
- `language`: programming language identifier used to select the appropriate subprocess call for program execution
- `natural_language`: natural language identifier to show the language the prompt is in
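Given these fields, a sample from the `python` split can be checked for correctness roughly as follows. This is a minimal sketch with a stand-in `sample` dict (a real harness would sandbox the `exec` calls and dispatch on `language`):

```python
# Minimal sketch: verify a (stand-in) python-split sample by executing its
# prompt + canonical_solution, then running the bundled test function.
sample = {
    "prompt": 'def below_zero(operations):\n    """Return True if a running sum ever dips below zero."""\n',
    "canonical_solution": "    balance = 0\n    for op in operations:\n        balance += op\n        if balance < 0:\n            return True\n    return False\n",
    "test": "def check(candidate):\n    assert candidate([1, 2, 3]) == False\n    assert candidate([1, 2, -4, 5]) == True\n",
    "entry_point": "below_zero",
}

program = sample["prompt"] + sample["canonical_solution"] + sample["test"]
env = {}
exec(program, env)                        # defines the function and check()
env["check"](env[sample["entry_point"]])  # raises AssertionError on failure
print("all tests passed")
```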
### Data Splits
Programming languages are used to specify splits:
- python
- java
- javascript
- csharp
- go
- kotlin
- php
- perl
- ruby
- swift
- scala
- typescript
## Citation
For attribution in academic contexts, please cite this work as:
```
@article{peng2024humaneval,
title={HumanEval-XL: A Multilingual Code Generation Benchmark for Cross-lingual Natural Language Generalization},
author={Peng, Qiwei and Chai, Yekun and Li, Xuhong},
journal={arXiv preprint arXiv:2402.16694},
year={2024}
}
``` |
karo2w/rd | ---
license: apache-2.0
---
|
EleutherAI/quirky_addition_increment0_alice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 12620694.0
num_examples: 192000
- name: validation
num_bytes: 263159.0
num_examples: 4000
- name: test
num_bytes: 263034.0
num_examples: 4000
download_size: 4075447
dataset_size: 13146887.0
---
# Dataset Card for "quirky_addition_increment0_alice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
keirp/tiny_math | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 134067703
num_examples: 17063
download_size: 60407086
dataset_size: 134067703
---
# Dataset Card for "tiny_math"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Outside/prova | ---
license: other
---
|
dchatca/vn-economic-articles-summary-remove | ---
dataset_info:
features:
- name: Title
dtype: string
- name: Content
dtype: string
- name: Sum-Content
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 12556382.253918495
num_examples: 1148
- name: test
num_bytes: 1400014.7460815047
num_examples: 128
download_size: 6396736
dataset_size: 13956397.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
YangyiYY/LVLM_NLF | ---
task_categories:
- conversational
- text-generation
language:
- en
pretty_name: LVLM_NLF
size_categories:
- 10K<n<100K
---
NOTE: LVLM_NLF and VLSafe are constructed from COCO and LLaVA, so the images can be retrieved directly from the COCO train2017 set using the image id.
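Assuming the standard COCO train2017 file naming (12-digit zero-padded image ids), an image id maps to its local filename like this; a hedged sketch, not part of the dataset's own tooling:

```python
# Assumption: standard COCO train2017 naming -- 12-digit zero-padded id + ".jpg".
def coco_train2017_filename(image_id: int) -> str:
    return f"{image_id:012d}.jpg"

print(coco_train2017_filename(139))  # -> 000000000139.jpg
```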
LVLM_NLF (Large Vision Language Model with Natural Language Feedback) Dataset Card
Dataset details
Dataset type: LVLM_NLF is a GPT-4-Annotated natural language feedback dataset that aims to improve the 3H alignment and interaction ability of large vision-language models (LVLMs).
Dataset date: LVLM_NLF was collected between September and November 2023.
Paper of this dataset: https://arxiv.org/abs/2311.10081
VLSafe (vision-language safety) Dataset Card
We also create and release the VLSafe dataset, which contains training and testing sets for improving and examining the harmlessness alignment of LVLMs.
Dataset type: VLSafe is a GPT-3.5-Turbo-Annotated dataset.
Dataset date: VLSafe was collected between September and October 2023.
|
open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T17:23:23.398012](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test/blob/main/results_2024-02-04T17-23-23.398012.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2709585369031208,\n\
\ \"acc_stderr\": 0.03132168435923893,\n \"acc_norm\": 0.2720079653463839,\n\
\ \"acc_norm_stderr\": 0.032088283521883774,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3558941030383168,\n\
\ \"mc2_stderr\": 0.013905917466770197\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29436860068259385,\n \"acc_stderr\": 0.013318528460539419,\n\
\ \"acc_norm\": 0.3250853242320819,\n \"acc_norm_stderr\": 0.013688147309729124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4245170284803824,\n\
\ \"acc_stderr\": 0.004932593348813624,\n \"acc_norm\": 0.5584544911372237,\n\
\ \"acc_norm_stderr\": 0.004955564650016174\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724067,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962881,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962881\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29533678756476683,\n \"acc_stderr\": 0.03292296639155139,\n\
\ \"acc_norm\": 0.29533678756476683,\n \"acc_norm_stderr\": 0.03292296639155139\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978086,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978086\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094635,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n\
\ \"acc_stderr\": 0.026655699653922747,\n \"acc_norm\": 0.2094017094017094,\n\
\ \"acc_norm_stderr\": 0.026655699653922747\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n\
\ \"acc_stderr\": 0.01584243083526944,\n \"acc_norm\": 0.2681992337164751,\n\
\ \"acc_norm_stderr\": 0.01584243083526944\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468648,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468648\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087866,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087866\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.025583062489984838,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.025583062489984838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307706,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307706\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23142112125162972,\n\
\ \"acc_stderr\": 0.01077146171157646,\n \"acc_norm\": 0.23142112125162972,\n\
\ \"acc_norm_stderr\": 0.01077146171157646\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067561,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067561\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031023,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3558941030383168,\n\
\ \"mc2_stderr\": 0.013905917466770197\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6211523283346487,\n \"acc_stderr\": 0.013633724603180318\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \
\ \"acc_stderr\": 0.004172883669643982\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|arc:challenge|25_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|gsm8k|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hellaswag|10_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T17-23-23.398012.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- '**/details_harness|winogrande|5_2024-02-04T17-23-23.398012.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T17-23-23.398012.parquet'
- config_name: results
data_files:
- split: 2024_02_04T17_23_23.398012
path:
- results_2024-02-04T17-23-23.398012.parquet
- split: latest
path:
- results_2024-02-04T17-23-23.398012.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T17:23:23.398012](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test/blob/main/results_2024-02-04T17-23-23.398012.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2709585369031208,
"acc_stderr": 0.03132168435923893,
"acc_norm": 0.2720079653463839,
"acc_norm_stderr": 0.032088283521883774,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3558941030383168,
"mc2_stderr": 0.013905917466770197
},
"harness|arc:challenge|25": {
"acc": 0.29436860068259385,
"acc_stderr": 0.013318528460539419,
"acc_norm": 0.3250853242320819,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.4245170284803824,
"acc_stderr": 0.004932593348813624,
"acc_norm": 0.5584544911372237,
"acc_norm_stderr": 0.004955564650016174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962881,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962881
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29533678756476683,
"acc_stderr": 0.03292296639155139,
"acc_norm": 0.29533678756476683,
"acc_norm_stderr": 0.03292296639155139
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094635,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2094017094017094,
"acc_stderr": 0.026655699653922747,
"acc_norm": 0.2094017094017094,
"acc_norm_stderr": 0.026655699653922747
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2681992337164751,
"acc_stderr": 0.01584243083526944,
"acc_norm": 0.2681992337164751,
"acc_norm_stderr": 0.01584243083526944
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468648,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468648
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984838,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307706,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23142112125162972,
"acc_stderr": 0.01077146171157646,
"acc_norm": 0.23142112125162972,
"acc_norm_stderr": 0.01077146171157646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067561,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067561
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031023,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3558941030383168,
"mc2_stderr": 0.013905917466770197
},
"harness|winogrande|5": {
"acc": 0.6211523283346487,
"acc_stderr": 0.013633724603180318
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643982
}
}
```
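The per-task scores above all share the `"harness|<task>|<n_shots>"` key pattern, so once the results JSON is loaded as a plain Python dict the tasks can be ranked directly. The sketch below copies three of the values shown above into a small dict; `rank_by_acc` is a hypothetical helper for illustration, not part of the evaluation harness:

```python
# A few per-task entries copied from the results dict above.
# Keys follow the "harness|<task>|<n_shots>" naming convention.
results = {
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.4675925925925926},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.1984126984126984},
    "harness|winogrande|5": {"acc": 0.6211523283346487},
}

def rank_by_acc(res):
    """Return (task, acc) pairs sorted from best to worst accuracy."""
    return sorted(
        ((task, vals["acc"]) for task, vals in res.items() if "acc" in vals),
        key=lambda pair: pair[1],
        reverse=True,
    )

for task, acc in rank_by_acc(results):
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full dict returned by loading the JSON linked above; tasks reporting only `mc1`/`mc2` (e.g. truthfulqa) are simply skipped by the `"acc" in vals` filter.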
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HuggingKG/bitirme-ds-mini2 | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Context
dtype: string
- name: Question Length
dtype: int64
- name: Context Length
dtype: int64
- name: answer_start
dtype: int64
- name: answer_end
dtype: int64
splits:
- name: train
num_bytes: 203172
num_examples: 748
- name: validation
num_bytes: 22568
num_examples: 84
download_size: 68152
dataset_size: 225740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4 | ---
pretty_name: Evaluation run of NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4](https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:18:36.826328](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4/blob/main/results_2023-09-23T08-18-36.826328.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02160234899328859,\n\
\ \"em_stderr\": 0.0014888393578850604,\n \"f1\": 0.07773175335570466,\n\
\ \"f1_stderr\": 0.0019038640159988432,\n \"acc\": 0.4092104767130632,\n\
\ \"acc_stderr\": 0.009856677593330436\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02160234899328859,\n \"em_stderr\": 0.0014888393578850604,\n\
\ \"f1\": 0.07773175335570466,\n \"f1_stderr\": 0.0019038640159988432\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.0073906544811082045\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|drop|3_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-18-36.826328.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-18-36.826328.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:03:57.703003.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:03:57.703003.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_18_36.826328
path:
- '**/details_harness|winogrande|5_2023-09-23T08-18-36.826328.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-18-36.826328.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_03_57.703003
path:
- results_2023-09-02T17:03:57.703003.parquet
- split: 2023_09_23T08_18_36.826328
path:
- results_2023-09-23T08-18-36.826328.parquet
- split: latest
path:
- results_2023-09-23T08-18-36.826328.parquet
---
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4](https://huggingface.co/NobodyExistsOnTheInternet/GiftedConvo13bLoraNoEconsE4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4",
"harness_winogrande_5",
	split="latest")
```
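Each timestamped split name in the `configs` section above is simply the run timestamp with the `-` and `:` characters replaced by underscores. The helper below is a small illustrative sketch (the `run_split_name` function is not part of the `datasets` API) for targeting one specific run instead of the `latest` alias:

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    '2023-09-23T08:18:36.826328' -> '2023_09_23T08_18_36.826328'."""
    return timestamp.replace("-", "_").replace(":", "_")

# To load that exact run rather than the "latest" alias:
# data = load_dataset(
#     "open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4",
#     "harness_winogrande_5",
#     split=run_split_name("2023-09-23T08:18:36.826328"),
# )
print(run_split_name("2023-09-23T08:18:36.826328"))
```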
## Latest results
These are the [latest results from run 2023-09-23T08:18:36.826328](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__GiftedConvo13bLoraNoEconsE4/blob/main/results_2023-09-23T08-18-36.826328.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850604,
"f1": 0.07773175335570466,
"f1_stderr": 0.0019038640159988432,
"acc": 0.4092104767130632,
"acc_stderr": 0.009856677593330436
},
"harness|drop|3": {
"em": 0.02160234899328859,
"em_stderr": 0.0014888393578850604,
"f1": 0.07773175335570466,
"f1_stderr": 0.0019038640159988432
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.0073906544811082045
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_cola_uninflect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 10143
num_examples: 138
- name: test
num_bytes: 11291
num_examples: 156
- name: train
num_bytes: 87017
num_examples: 1208
download_size: 54983
dataset_size: 108451
---
# Dataset Card for "MULTI_VALUE_cola_uninflect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
toughdata/quora-question-answer-dataset | ---
license: gpl-3.0
task_categories:
- question-answering
- conversational
- text2text-generation
language:
- en
tags:
- question
- answer
- quora
pretty_name: Quora Question/Answer Pairs
---
Quora Question Answer Dataset (Quora-QuAD) contains 56,402 question-answer pairs scraped from Quora.
# Usage:
For instructions on fine-tuning a model (Flan-T5) with this dataset, please check out the article: https://www.toughdata.net/blog/post/finetune-flan-t5-question-answer-quora-dataset |
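Before following the fine-tuning article, the pairs need to be mapped into the (input, target) form a seq2seq model like Flan-T5 expects. A minimal sketch of that preprocessing step (the `question`/`answer` column names and the prompt template are assumptions here; verify the actual column names after loading the dataset):

```python
# Minimal sketch: map one Quora-QuAD record to a seq2seq training pair.
# Assumptions: records expose "question" and "answer" string fields, and a
# simple instruction-style prompt template is used (adjust to taste).

def to_seq2seq_pair(record, prompt_template="Answer the question: {question}"):
    """Return an {"input_text", "target_text"} pair for seq2seq fine-tuning."""
    input_text = prompt_template.format(question=record["question"].strip())
    target_text = record["answer"].strip()
    return {"input_text": input_text, "target_text": target_text}

if __name__ == "__main__":
    example = {
        "question": "What is the capital of France?",
        "answer": "The capital of France is Paris.",
    }
    pair = to_seq2seq_pair(example)
    print(pair["input_text"])   # Answer the question: What is the capital of France?
    print(pair["target_text"])  # The capital of France is Paris.
```

With the `datasets` library, a function like this can be applied across the whole corpus via `dataset.map(to_seq2seq_pair)` before tokenization.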
CyberHarem/sana_kuranaka_onichichi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sana Kuranaka
This is the dataset of Sana Kuranaka, containing 134 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 134 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 293 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 353 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 134 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 134 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 134 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 293 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 293 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 213 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 353 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 353 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
alvations/stash | ---
license: cc0-1.0
---
|
felipesampaio2010/jennhi5usa | ---
license: openrail
---
|
AdapterOcean/code_instructions_standardized_cluster_11 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 81708985
num_examples: 7091
download_size: 26582930
dataset_size: 81708985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chatv0.1 | ---
pretty_name: Evaluation run of giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1](https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chatv0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T20:27:12.874209](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chatv0.1/blob/main/results_2024-03-02T20-27-12.874209.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6491761599133914,\n\
\ \"acc_stderr\": 0.03209121942325631,\n \"acc_norm\": 0.6505386066862624,\n\
\ \"acc_norm_stderr\": 0.032738800781680906,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5149650221658989,\n\
\ \"mc2_stderr\": 0.015183750756923416\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n\
\ \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192307\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.662617008564031,\n\
\ \"acc_stderr\": 0.0047185047710837655,\n \"acc_norm\": 0.8537143995220076,\n\
\ \"acc_norm_stderr\": 0.003526700741879445\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.01585200244986211,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.01585200244986211\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"\
acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598563,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598563\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5149650221658989,\n\
\ \"mc2_stderr\": 0.015183750756923416\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936655\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6520090978013646,\n \
\ \"acc_stderr\": 0.013120581030382132\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|arc:challenge|25_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|gsm8k|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hellaswag|10_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-27-12.874209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T20-27-12.874209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- '**/details_harness|winogrande|5_2024-03-02T20-27-12.874209.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T20-27-12.874209.parquet'
- config_name: results
data_files:
- split: 2024_03_02T20_27_12.874209
path:
- results_2024-03-02T20-27-12.874209.parquet
- split: latest
path:
- results_2024-03-02T20-27-12.874209.parquet
---
# Dataset Card for Evaluation run of giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1](https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chatv0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following (note that, per the configuration list above, the splits are named "latest" and by run timestamp):
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chatv0.1",
	"harness_winogrande_5",
	split="latest")
```
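The per-subject MMLU configurations listed above all follow the same naming pattern. As a convenience, a small helper can build the config name for any subject; `mmlu_config` is a hypothetical function written for this card, not part of the `datasets` library, and it assumes the `harness_hendrycksTest_<subject>_<n_shot>` pattern visible in the YAML configuration list:

```python
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU subject, matching the
    harness_hendrycksTest_<subject>_<n_shot> pattern used in this card."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# Example: the anatomy subject evaluated with 5-shot prompting.
print(mmlu_config("anatomy"))  # harness_hendrycksTest_anatomy_5
```

The returned string can then be passed as the second argument to `load_dataset` in the snippet above.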
## Latest results
These are the [latest results from run 2024-03-02T20:27:12.874209](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chatv0.1/blob/main/results_2024-03-02T20-27-12.874209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6491761599133914,
"acc_stderr": 0.03209121942325631,
"acc_norm": 0.6505386066862624,
"acc_norm_stderr": 0.032738800781680906,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5149650221658989,
"mc2_stderr": 0.015183750756923416
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192307
},
"harness|hellaswag|10": {
"acc": 0.662617008564031,
"acc_stderr": 0.0047185047710837655,
"acc_norm": 0.8537143995220076,
"acc_norm_stderr": 0.003526700741879445
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.01585200244986211,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.01585200244986211
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598563,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598563
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018515,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018515
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5149650221658989,
"mc2_stderr": 0.015183750756923416
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936655
},
"harness|gsm8k|5": {
"acc": 0.6520090978013646,
"acc_stderr": 0.013120581030382132
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1 | ---
pretty_name: Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TomGrc/FusionNet_7Bx2_MoE_v0.1](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T23:18:59.818321](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1/blob/main/results_2024-02-01T23-18-59.818321.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567476767898744,\n\
\ \"acc_stderr\": 0.0319983885224105,\n \"acc_norm\": 0.6555568221631944,\n\
\ \"acc_norm_stderr\": 0.03268292282008458,\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7120018677798674,\n\
\ \"mc2_stderr\": 0.014772831374257856\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.01320319608853737,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7210714997012547,\n\
\ \"acc_stderr\": 0.004475557360359705,\n \"acc_norm\": 0.8889663413662617,\n\
\ \"acc_norm_stderr\": 0.0031353173122281226\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653344,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653344\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7120018677798674,\n\
\ \"mc2_stderr\": 0.014772831374257856\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8752959747434885,\n \"acc_stderr\": 0.009285404952684428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624179\n }\n}\n```"
repo_url: https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-18-59.818321.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- '**/details_harness|winogrande|5_2024-02-01T23-18-59.818321.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T23-18-59.818321.parquet'
- config_name: results
data_files:
- split: 2024_02_01T23_18_59.818321
path:
- results_2024-02-01T23-18-59.818321.parquet
- split: latest
path:
- results_2024-02-01T23-18-59.818321.parquet
---
# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_7Bx2_MoE_v0.1](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1",
"harness_winogrande_5",
	split="latest")
```
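The config names listed in the YAML header follow a simple convention: each results key (e.g. `harness|hendrycksTest-anatomy|5`) maps to a config name by replacing the `|`, `-`, and `:` separators with underscores. A small helper illustrating this mapping (an informal sketch inferred from the naming pattern visible in this card, not an official API):

```python
def config_name_from_results_key(key: str) -> str:
    """Map a results key such as 'harness|hendrycksTest-anatomy|5'
    to the corresponding config name 'harness_hendrycksTest_anatomy_5'."""
    return key.replace("|", "_").replace("-", "_").replace(":", "_")

# Matches the config names listed in the YAML header above:
print(config_name_from_results_key("harness|hendrycksTest-anatomy|5"))
# harness_hendrycksTest_anatomy_5
print(config_name_from_results_key("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```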
## Latest results
These are the [latest results from run 2024-02-01T23:18:59.818321](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1/blob/main/results_2024-02-01T23-18-59.818321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6567476767898744,
"acc_stderr": 0.0319983885224105,
"acc_norm": 0.6555568221631944,
"acc_norm_stderr": 0.03268292282008458,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7120018677798674,
"mc2_stderr": 0.014772831374257856
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.01320319608853737,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927104
},
"harness|hellaswag|10": {
"acc": 0.7210714997012547,
"acc_stderr": 0.004475557360359705,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281226
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653344,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653344
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7120018677798674,
"mc2_stderr": 0.014772831374257856
},
"harness|winogrande|5": {
"acc": 0.8752959747434885,
"acc_stderr": 0.009285404952684428
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624179
}
}
```
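Aggregates such as the MMLU macro-average can be recomputed directly from a results dictionary of this shape. An illustrative sketch over a small subset of the values above (assuming the aggregate is an unweighted mean over tasks):

```python
# Illustrative subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.674074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Unweighted mean accuracy over all hendrycksTest (MMLU) tasks.
mmlu_scores = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU macro-average over {len(mmlu_scores)} tasks: {mmlu_macro_avg:.4f}")
# MMLU macro-average over 3 tasks: 0.5705
```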
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2 | ---
pretty_name: Evaluation run of luffycodes/llama-shishya-7b-ep3-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/llama-shishya-7b-ep3-v2](https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T12:57:06.707192](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2_public/blob/main/results_2023-11-09T12-57-06.707192.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43776292457137356,\n\
\ \"acc_stderr\": 0.03405236312139111,\n \"acc_norm\": 0.44440566326787106,\n\
\ \"acc_norm_stderr\": 0.03497626520397757,\n \"mc1\": 0.19583843329253367,\n\
\ \"mc1_stderr\": 0.01389234436774209,\n \"mc2\": 0.3016304809342682,\n\
\ \"mc2_stderr\": 0.013699598037265183,\n \"em\": 0.30557885906040266,\n\
\ \"em_stderr\": 0.004717509363446725,\n \"f1\": 0.36205327181208175,\n\
\ \"f1_stderr\": 0.004656030495449622\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44197952218430037,\n \"acc_stderr\": 0.014512682523128345,\n\
\ \"acc_norm\": 0.4735494880546075,\n \"acc_norm_stderr\": 0.014590931358120172\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5865365465046803,\n\
\ \"acc_stderr\": 0.004914480534533716,\n \"acc_norm\": 0.7588129854610636,\n\
\ \"acc_norm_stderr\": 0.004269291950109927\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.040089737857792046,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.040089737857792046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\"\
: 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.5854922279792746,\n \"acc_stderr\": 0.035553003195576686,\n\
\ \"acc_norm\": 0.5854922279792746,\n \"acc_norm_stderr\": 0.035553003195576686\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.024537591572830517,\n\
\ \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.024537591572830517\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236153,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236153\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5944954128440367,\n \"acc_stderr\": 0.021050997991896834,\n \"\
acc_norm\": 0.5944954128440367,\n \"acc_norm_stderr\": 0.021050997991896834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217308,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217308\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6590038314176245,\n\
\ \"acc_stderr\": 0.016951781383223313,\n \"acc_norm\": 0.6590038314176245,\n\
\ \"acc_norm_stderr\": 0.016951781383223313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364546,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364546\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.028320325830105908,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.028320325830105908\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.02780749004427621,\n\
\ \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.02780749004427621\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3044328552803129,\n\
\ \"acc_stderr\": 0.01175287759259757,\n \"acc_norm\": 0.3044328552803129,\n\
\ \"acc_norm_stderr\": 0.01175287759259757\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4199346405228758,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.4199346405228758,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.03599335771456027,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.03599335771456027\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.19583843329253367,\n\
\ \"mc1_stderr\": 0.01389234436774209,\n \"mc2\": 0.3016304809342682,\n\
\ \"mc2_stderr\": 0.013699598037265183\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6874506708760852,\n \"acc_stderr\": 0.013027563620748835\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.30557885906040266,\n \
\ \"em_stderr\": 0.004717509363446725,\n \"f1\": 0.36205327181208175,\n\
\ \"f1_stderr\": 0.004656030495449622\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|arc:challenge|25_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|drop|3_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|gsm8k|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hellaswag|10_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-57-06.707192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T12-57-06.707192.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- '**/details_harness|winogrande|5_2023-11-09T12-57-06.707192.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T12-57-06.707192.parquet'
- config_name: results
data_files:
- split: 2023_11_09T12_57_06.707192
path:
- results_2023-11-09T12-57-06.707192.parquet
- split: latest
path:
- results_2023-11-09T12-57-06.707192.parquet
---
# Dataset Card for Evaluation run of luffycodes/llama-shishya-7b-ep3-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/llama-shishya-7b-ep3-v2](https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2_public",
"harness_winogrande_5",
split="train")
```
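The config names above follow a simple convention: each harness task key (such as `harness|hendrycksTest-world_religions|5`) becomes a config name with its separator characters replaced by underscores. A minimal helper sketch (the function name is ours, not part of the `datasets` library):

```python
def task_key_to_config(task_key: str) -> str:
    """Derive this repo's config name from a harness task key,
    e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0"."""
    config = task_key
    for ch in "|:-":  # separators used in harness task keys
        config = config.replace(ch, "_")
    return config
```

This can be handy for programmatically iterating over the per-task configs listed in the YAML header.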
## Latest results
These are the [latest results from run 2023-11-09T12:57:06.707192](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2_public/blob/main/results_2023-11-09T12-57-06.707192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43776292457137356,
"acc_stderr": 0.03405236312139111,
"acc_norm": 0.44440566326787106,
"acc_norm_stderr": 0.03497626520397757,
"mc1": 0.19583843329253367,
"mc1_stderr": 0.01389234436774209,
"mc2": 0.3016304809342682,
"mc2_stderr": 0.013699598037265183,
"em": 0.30557885906040266,
"em_stderr": 0.004717509363446725,
"f1": 0.36205327181208175,
"f1_stderr": 0.004656030495449622
},
"harness|arc:challenge|25": {
"acc": 0.44197952218430037,
"acc_stderr": 0.014512682523128345,
"acc_norm": 0.4735494880546075,
"acc_norm_stderr": 0.014590931358120172
},
"harness|hellaswag|10": {
"acc": 0.5865365465046803,
"acc_stderr": 0.004914480534533716,
"acc_norm": 0.7588129854610636,
"acc_norm_stderr": 0.004269291950109927
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5854922279792746,
"acc_stderr": 0.035553003195576686,
"acc_norm": 0.5854922279792746,
"acc_norm_stderr": 0.035553003195576686
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.024537591572830517,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.024537591572830517
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5944954128440367,
"acc_stderr": 0.021050997991896834,
"acc_norm": 0.5944954128440367,
"acc_norm_stderr": 0.021050997991896834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859672,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217308,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217308
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6590038314176245,
"acc_stderr": 0.016951781383223313,
"acc_norm": 0.6590038314176245,
"acc_norm_stderr": 0.016951781383223313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364546,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364546
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.028320325830105908,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.028320325830105908
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.02780749004427621,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.02780749004427621
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3044328552803129,
"acc_stderr": 0.01175287759259757,
"acc_norm": 0.3044328552803129,
"acc_norm_stderr": 0.01175287759259757
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4199346405228758,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.4199346405228758,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.03599335771456027,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.03599335771456027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.19583843329253367,
"mc1_stderr": 0.01389234436774209,
"mc2": 0.3016304809342682,
"mc2_stderr": 0.013699598037265183
},
"harness|winogrande|5": {
"acc": 0.6874506708760852,
"acc_stderr": 0.013027563620748835
},
"harness|drop|3": {
"em": 0.30557885906040266,
"em_stderr": 0.004717509363446725,
"f1": 0.36205327181208175,
"f1_stderr": 0.004656030495449622
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
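The aggregate metrics under `"all"` and the per-task entries can be read straight out of this JSON. A short sketch (the excerpt is inlined so the snippet runs standalone; in practice you would `json.load()` the `results_2023-11-09T12-57-06.707192.parquet`-backed JSON file from this repo):

```python
import json

# Small excerpt of the results JSON shown above.
excerpt = """
{
  "all": {"acc": 0.43776292457137356, "mc1": 0.19583843329253367},
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.672514619883041}
}
"""
results = json.loads(excerpt)

# Average accuracy over the per-task MMLU entries present in the excerpt.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(results["all"]["acc"], mean_acc)
```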
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
projectbaraat/hindi-Mathematical | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 466401641
num_examples: 335642
download_size: 160851233
dataset_size: 466401641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JosephLee/science_textbook_elementary_kor | ---
language:
- ko
task_categories:
- question-answering
tags:
- textbook
- elementary
- science
--- |
bigscience-data/roots_indic-ml_indic_nlp_corpus | ---
language: ml
license: cc-by-nc-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-ml_indic_nlp_corpus
# Indic NLP Corpus
- Dataset uid: `indic_nlp_corpus`
### Description
The IndicNLP corpus is a large-scale, general-domain corpus containing 2.7 billion words for 10 Indian languages from two language families (Indo-Aryan and Dravidian). Each language has at least 100 million words (except Oriya).
### Homepage
https://github.com/AI4Bharat/indicnlp_corpus#publicly-available-classification-datasets
### Licensing
- non-commercial use
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
### Speaker Locations
- Southern Asia
- India
### Sizes
- 3.4019 % of total
- 44.4368 % of indic-hi
- 64.2943 % of indic-ta
- 70.5374 % of indic-ml
- 54.2394 % of indic-te
- 55.9105 % of indic-kn
- 61.6111 % of indic-mr
- 67.2242 % of indic-pa
- 68.1470 % of indic-or
- 64.3879 % of indic-gu
- 4.1495 % of indic-bn
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
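Two of the recurring filters above are simple to describe from their names alone: `filter_remove_empty_docs` drops empty documents, and `filter_small_docs_bytes_300` drops documents under 300 UTF-8 bytes. A minimal sketch of what such filters might look like (our own reconstruction, not the actual BigScience preprocessing code):

```python
def filter_remove_empty_docs(docs):
    """Drop documents that are empty after stripping whitespace."""
    return [d for d in docs if d.strip()]

def filter_small_docs_bytes_300(docs, min_bytes=300):
    """Drop documents shorter than min_bytes when UTF-8 encoded."""
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

docs = ["", "   ", "short doc", "x" * 300]
kept = filter_small_docs_bytes_300(filter_remove_empty_docs(docs))
```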
|
assq/11 | ---
license: cc0-1.0
---
|
jon-tow/okapi_arc_challenge | ---
language:
- ar
- bn
- ca
- da
- de
- es
- eu
- fr
- gu
- hi
- hr
- hu
- hy
- id
- it
- kn
- ml
- mr
- ne
- nl
- pt
- ro
- ru
- sk
- sr
- sv
- ta
- te
- uk
- vi
license: cc-by-nc-4.0
---
# okapi_arc_challenge
<!-- Provide a quick summary of the dataset. -->
Multilingual translation of [AI2's ARC Challenge](https://allenai.org/data/arc) from the paper *"Okapi: Instruction-tuned Large Language Models in Multiple Languages with Reinforcement Learning from Human Feedback"* ([Lai et al., 2023](https://arxiv.org/abs/2307.16039))
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
ARC is a dataset of 7,787 genuine grade-school level, multiple-choice science questions assembled to encourage research in
advanced question-answering. The dataset is partitioned into a Challenge Set and an Easy Set, where the former contains
only questions answered incorrectly by both a retrieval-based algorithm and a word co-occurrence algorithm. We also
include a corpus of over 14 million science sentences relevant to the task and an implementation of three neural baseline models for this dataset.
We pose ARC as a challenge to the community.
- **Curated by:** Dac Lai, Viet and Van Nguyen, Chien and Ngo, Nghia Trung and Nguyen, Thuat and Dernoncourt, Franck and Rossi, Ryan A and Nguyen, Thien Huu
- **License:** The datasets are CC BY NC 4.0 (allowing only non-commercial use).
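
The multiple-choice layout described above can be illustrated with a toy record and a scoring helper. The field names (`question`, `choices`, `answerKey`) follow the public ARC release; the record and helper are illustrative only, not part of the Okapi pipeline:

```python
# A toy record in ARC-style multiple-choice format (field names
# follow the public ARC release; the content is made up).
record = {
    "question": "Which gas do plants absorb during photosynthesis?",
    "choices": {
        "text": ["Oxygen", "Carbon dioxide", "Nitrogen", "Helium"],
        "label": ["A", "B", "C", "D"],
    },
    "answerKey": "B",
}

def is_correct(rec, predicted_label):
    """Score a single prediction against the gold answerKey."""
    return predicted_label == rec["answerKey"]
```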
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** http://nlp.uoregon.edu/download/okapi-eval/datasets/
- **Paper:** Okapi ([Lai et al., 2023](https://arxiv.org/abs/2307.16039))
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@article{dac2023okapi,
title={Okapi: Instruction-tuned Large Language Models in Multiple Languages with Reinforcement Learning from Human Feedback},
author={Dac Lai, Viet and Van Nguyen, Chien and Ngo, Nghia Trung and Nguyen, Thuat and Dernoncourt, Franck and Rossi, Ryan A and Nguyen, Thien Huu},
journal={arXiv e-prints},
pages={arXiv--2307},
year={2023}
}
```
```bibtex
@article{Clark2018ThinkYH,
title={Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge},
author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord},
journal={ArXiv},
year={2018},
volume={abs/1803.05457}
}
```
|
open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b | ---
pretty_name: Evaluation run of PocketDoc/Dans-TotSirocco-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T12:54:48.005243](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-23T12-54-48.005243.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.44997902684563756,\n\
\ \"em_stderr\": 0.00509477973209699,\n \"f1\": 0.49544777684563845,\n\
\ \"f1_stderr\": 0.00490923385938236,\n \"acc\": 0.45978722729484023,\n\
\ \"acc_stderr\": 0.01042644341108249\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.44997902684563756,\n \"em_stderr\": 0.00509477973209699,\n\
\ \"f1\": 0.49544777684563845,\n \"f1_stderr\": 0.00490923385938236\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1326762699014405,\n \
\ \"acc_stderr\": 0.009343929131442216\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722764\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-TotSirocco-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|arc:challenge|25_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T12_54_48.005243
path:
- '**/details_harness|drop|3_2023-10-23T12-54-48.005243.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T12-54-48.005243.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T12_54_48.005243
path:
- '**/details_harness|gsm8k|5_2023-10-23T12-54-48.005243.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T12-54-48.005243.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hellaswag|10_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T23-41-30.846721.parquet'
- split: 2023_10_10T03_08_42.670420
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T03-08-42.670420.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T12_54_48.005243
path:
- '**/details_harness|winogrande|5_2023-10-23T12-54-48.005243.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T12-54-48.005243.parquet'
- config_name: results
data_files:
- split: 2023_10_09T23_41_30.846721
path:
- results_2023-10-09T23-41-30.846721.parquet
- split: 2023_10_10T03_08_42.670420
path:
- results_2023-10-10T03-08-42.670420.parquet
- split: 2023_10_23T12_54_48.005243
path:
- results_2023-10-23T12-54-48.005243.parquet
- split: latest
path:
- results_2023-10-23T12-54-48.005243.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-TotSirocco-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-TotSirocco-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-TotSirocco-7b](https://huggingface.co/PocketDoc/Dans-TotSirocco-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-23T12:54:48.005243](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-TotSirocco-7b/blob/main/results_2023-10-23T12-54-48.005243.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.44997902684563756,
"em_stderr": 0.00509477973209699,
"f1": 0.49544777684563845,
"f1_stderr": 0.00490923385938236,
"acc": 0.45978722729484023,
"acc_stderr": 0.01042644341108249
},
"harness|drop|3": {
"em": 0.44997902684563756,
"em_stderr": 0.00509477973209699,
"f1": 0.49544777684563845,
"f1_stderr": 0.00490923385938236
},
"harness|gsm8k|5": {
"acc": 0.1326762699014405,
"acc_stderr": 0.009343929131442216
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722764
}
}
```
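As a sanity check, the top-level `all` block appears to simply average the per-task metrics; for the two accuracy-based tasks above:

```python
gsm8k_acc = 0.1326762699014405
winogrande_acc = 0.7868981846882399

# matches the "acc" value reported in the "all" block (up to float rounding)
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)
```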
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_comparative_than | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 783
num_examples: 4
- name: train
num_bytes: 1046
num_examples: 5
download_size: 0
dataset_size: 1829
---
# Dataset Card for "MULTI_VALUE_stsb_comparative_than"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_baseline_v5_full_recite_full_passage_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4369231.0
num_examples: 2385
- name: validation
num_bytes: 573308
num_examples: 300
download_size: 1012407
dataset_size: 4942539.0
---
# Dataset Card for "squad_qa_baseline_v5_full_recite_full_passage_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sunbird/Synthetic-Salt-Luganda-13-6-23 | ---
dataset_info:
features:
- name: audio
sequence:
sequence: float32
- name: sample_rate
dtype: int64
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 8360315972
num_examples: 25000
download_size: 8282006533
dataset_size: 8360315972
---
# Dataset Card for "Synthetic-Salt-Luganda-13-6-23"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_indef_one | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2270650
num_examples: 13219
- name: test
num_bytes: 23668466
num_examples: 137338
- name: train
num_bytes: 20579019
num_examples: 119383
download_size: 28922383
dataset_size: 46518135
---
# Dataset Card for "MULTI_VALUE_qqp_indef_one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edf23/bash | ---
license: openrail
---
|
lighteval/aimo_progress_prize_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: problem
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 2441
num_examples: 10
download_size: 4258
dataset_size: 2441
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-bba54b81-5330-48f8-b7bf-1cb797f93bcf-5246 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
glaiveai/glaive-code-assistant-v2 | ---
license: apache-2.0
size_categories:
- 100K<n<1M
tags:
- code
- synthetic
---
# Glaive-code-assistant-v2
Glaive-code-assistant-v2 is a dataset of ~215k code problems and solutions generated using Glaive’s synthetic data generation platform.
This is built on top of the previous version of the dataset that can be found [here](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
To report any problems or suggestions in the data, join the [Glaive discord](https://discord.gg/fjQ4uf3yWD) |
Augusto777/dmae-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': avanzada
'1': leve
'2': moderada
'3': no amd
splits:
- name: train
num_bytes: 48967077.0
num_examples: 40
- name: test
num_bytes: 16065989.0
num_examples: 16
- name: validation
num_bytes: 15887796.0
num_examples: 16
download_size: 80912022
dataset_size: 80920862.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "dmae-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TarlewBR/vozalex | ---
license: openrail
---
|
ctoraman/atis-ner-turkish | ---
license: cc-by-nc-sa-4.0
task_categories:
- token-classification
language:
- tr
tags:
- named entity recognition
- ner
- atis
- utterance
- spoken query
---
The ATIS (Airline Travel Information System) Dataset includes spoken queries (i.e., utterances) annotated for the task of slot filling in conversational systems.
This dataset, ATISNER, includes airline spoken queries translated from English to Turkish, customized for Named Entity Recognition.
Train and test splits include 4,978 and 890 sentences, respectively.
Translations are provided by the following study.
Şahinuç, F., Yücesoy, V., & Koç, A. (2020). Intent Classification and Slot Filling for Turkish Dialogue Systems. In 2020 28th signal processing and communications applications conference (pp. 1–4).
Github Repo: https://github.com/avaapm/TurkishNamedEntityRecognition/
# If you would like to use any material in this repository, please cite the following paper:
Oguzhan Ozcelik and Cagri Toraman. 2022. Named entity recognition in Turkish: A comparative study with detailed error analysis. Inf. Process. Manage. 59, 6 (Nov 2022). https://doi.org/10.1016/j.ipm.2022.103065 |
jytole/AnimalAudio | ---
license: cc
tags:
- audio
- generation
Source: Museum für Naturkunde Berlin. Animal Sound Archive. Occurrence dataset https://doi.org/10.15468/0bpalr accessed via GBIF.org on 2023-06-20.
---
A dataset to fine-tune the AudioLDM Audio Generation Model |
jeggers/codingame | ---
license: cc-by-sa-3.0
---
This dataset is scraped from [codingame](https://www.codingame.com/home). Please check them out.
It contains coding problems with a description, example input and output, as well as about 5 test cases and 5 validator test cases per problem.
Because the test cases define only input and output, the dataset is completely language-agnostic and automatically verifiable.
This dataset is under Creative Commons Attribution Share Alike 3.0 license.
The dataset contains different types of puzzles:
| Game Type | Number of puzzles |
|-----------------------------|-------------------|
| Clash of Code | 2657 |
| Classic Puzzle | 680 |
| Multiplayer Game (only url) | 49 |
| Solo Game (only url) | 25 |
| Optimization Game (only url)| 11 |
For the last three categories: Multiplayer Game, Solo Game and Optimization Game, there is only the URL available.
Note that only Clash of Code has gamemodes and only Classic Puzzle has a difficulty attribute.
There are three possible gamemodes: Fastest, Shortest, and Reverse. In the original multiplayer Clash of Code they work as follows:
In Fastest mode, the player who submits a working solution first wins. In Reverse mode, the goal is also to be the fastest, but instead of the problem statement,
you get the validator inputs and outputs. Your task is to understand the pattern and write code that also works for the test cases. In Shortest mode, the goal is
to submit working code with the fewest characters in any programming language. Whoever submits the shortest working code wins.
In all three gamemodes, your own code can be tested on the validator cases before submitting. The input and output of the validator cases are also accessible.
When you submit, the score is calculated with the actual test cases.
Here is a list of the difficulties for the Classic Puzzles:
| Difficulty | Number of Challenges |
|------------|----------------------|
| Medium | 257 |
| Easy | 231 |
| Hard | 92 |
| Very Hard | 19 |
| Undefined | 81 |
For some puzzles the fields `example_in`, `example_out` or `constraints` are missing, but the problems should be solvable without them.
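Since each test case is just an input/output pair, verifying a candidate solution can be sketched in a few lines. The checker below is a hypothetical sketch (the function name and the trailing-whitespace handling are assumptions, not part of the dataset):

```python
import subprocess

def passes_test(command, test_input, expected_output):
    """Run a candidate program on a single test case and compare its
    stdout with the expected output, ignoring trailing whitespace."""
    result = subprocess.run(
        command, input=test_input, capture_output=True, text=True
    )
    return result.stdout.strip() == expected_output.strip()

# Toy check: `cat` copies stdin to stdout, so it "solves" the identity puzzle.
print(passes_test(["cat"], "42\n", "42\n"))
```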
|
khoomeik/satscale-sr-600 | ---
dataset_info:
features:
- name: name
dtype: string
- name: n_vars
dtype: int64
- name: n_clauses
dtype: int64
- name: clauses
sequence:
sequence: int64
- name: marginals
sequence: float64
- name: assignments
sequence: int64
splits:
- name: train
num_bytes: 2578688
num_examples: 600
- name: valid
num_bytes: 847400
num_examples: 200
- name: test
num_bytes: 832712
num_examples: 200
download_size: 475305
dataset_size: 4258800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|