---
pretty_name: Evaluation run of juhwanlee/llmdo-Mistral-7B-case-c-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/llmdo-Mistral-7B-case-c-v1](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-c-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T04:58:19.473029](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1/blob/main/results_2024-03-04T04-58-19.473029.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the \"results\" and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253386360949732,\n\
\ \"acc_stderr\": 0.032614370983831034,\n \"acc_norm\": 0.63033426016995,\n\
\ \"acc_norm_stderr\": 0.033275501464125654,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.4582292996496707,\n\
\ \"mc2_stderr\": 0.014699504590807561\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6358295160326628,\n\
\ \"acc_stderr\": 0.0048021335116542364,\n \"acc_norm\": 0.8354909380601474,\n\
\ \"acc_norm_stderr\": 0.003699791934754368\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.0420392104015628,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.0420392104015628\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n\
\ \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n\
\ \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"\
acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391534,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.4582292996496707,\n\
\ \"mc2_stderr\": 0.014699504590807561\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.01143045004588158\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3980288097043215,\n \
\ \"acc_stderr\": 0.013483026939074818\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-c-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|arc:challenge|25_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|gsm8k|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hellaswag|10_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T04-58-19.473029.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T04-58-19.473029.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- '**/details_harness|winogrande|5_2024-03-04T04-58-19.473029.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T04-58-19.473029.parquet'
- config_name: results
data_files:
- split: 2024_03_04T04_58_19.473029
path:
- results_2024-03-04T04-58-19.473029.parquet
- split: latest
path:
- results_2024-03-04T04-58-19.473029.parquet
---
# Dataset Card for Evaluation run of juhwanlee/llmdo-Mistral-7B-case-c-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [juhwanlee/llmdo-Mistral-7B-case-c-v1](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-c-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1",
"harness_winogrande_5",
split="train")
```
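Any other configuration can be loaded the same way. As a minimal sketch, the aggregated metrics can be read through the "results" configuration (the configuration and split names below come from this card's metadata, where the "latest" split always points to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; "latest" is the split that
# always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1",
    "results",
    split="latest",
)
```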
## Latest results
These are the [latest results from run 2024-03-04T04:58:19.473029](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1/blob/main/results_2024-03-04T04-58-19.473029.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6253386360949732,
"acc_stderr": 0.032614370983831034,
"acc_norm": 0.63033426016995,
"acc_norm_stderr": 0.033275501464125654,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.4582292996496707,
"mc2_stderr": 0.014699504590807561
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6358295160326628,
"acc_stderr": 0.0048021335116542364,
"acc_norm": 0.8354909380601474,
"acc_norm_stderr": 0.003699791934754368
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.0420392104015628,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.0420392104015628
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391534,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.4582292996496707,
"mc2_stderr": 0.014699504590807561
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.01143045004588158
},
"harness|gsm8k|5": {
"acc": 0.3980288097043215,
"acc_stderr": 0.013483026939074818
}
}
```
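To pull only the aggregated metrics shown above (rather than the per-task details), you can load the "results" configuration directly. A minimal sketch, assuming that, as for the task configurations, a "latest" split points at the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; the "latest"
# split is assumed to mirror the task configs and point at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated scores, matching the JSON block above
```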
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
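The configuration and split layout can also be inspected programmatically; a minimal sketch using the public `datasets` utilities (one configuration per evaluated task, plus the aggregated "results" configuration):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1"
for config in get_dataset_config_names(repo):
    # Splits are named after run timestamps; "latest" tracks the newest run.
    print(config, get_dataset_split_names(repo, config))
```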
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_jan-hq__stealth-v1.3 | ---
pretty_name: Evaluation run of jan-hq/stealth-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T13:33:32.733968](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-03-01T13-33-32.733968.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489306644624384,\n\
\ \"acc_stderr\": 0.032117814539989575,\n \"acc_norm\": 0.6488111440199534,\n\
\ \"acc_norm_stderr\": 0.03278124580734838,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5571199691389221,\n\
\ \"mc2_stderr\": 0.015289284314943528\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.014012883334859859,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729122\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6824337781318462,\n\
\ \"acc_stderr\": 0.00464578304800458,\n \"acc_norm\": 0.8673571001792472,\n\
\ \"acc_norm_stderr\": 0.003384951803213478\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406793,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406793\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.01651959427529712,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.01651959427529712\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5571199691389221,\n\
\ \"mc2_stderr\": 0.015289284314943528\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \
\ \"acc_stderr\": 0.01242507818839599\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/stealth-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|arc:challenge|25_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|arc:challenge|25_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|gsm8k|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|gsm8k|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hellaswag|10_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hellaswag|10_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-33-32.733968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T13-33-32.733968.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- '**/details_harness|winogrande|5_2024-01-14T07-33-07.818995.parquet'
- split: 2024_03_01T13_33_32.733968
path:
- '**/details_harness|winogrande|5_2024-03-01T13-33-32.733968.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T13-33-32.733968.parquet'
- config_name: results
data_files:
- split: 2024_01_14T07_33_07.818995
path:
- results_2024-01-14T07-33-07.818995.parquet
- split: 2024_03_01T13_33_32.733968
path:
- results_2024-03-01T13-33-32.733968.parquet
- split: latest
path:
- results_2024-03-01T13-33-32.733968.parquet
---
# Dataset Card for Evaluation run of jan-hq/stealth-v1.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.3",
"harness_winogrande_5",
split="train")
```
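The aggregated metrics shown under "Latest results" live in the `results` configuration; a minimal sketch loading its `latest` split (same API as the snippet above, with the config and split names taken from this card's metadata):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.3",
	"results",
	split="latest")
```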
## Latest results
These are the [latest results from run 2024-03-01T13:33:32.733968](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-03-01T13-33-32.733968.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6489306644624384,
"acc_stderr": 0.032117814539989575,
"acc_norm": 0.6488111440199534,
"acc_norm_stderr": 0.03278124580734838,
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5571199691389221,
"mc2_stderr": 0.015289284314943528
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859859,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729122
},
"harness|hellaswag|10": {
"acc": 0.6824337781318462,
"acc_stderr": 0.00464578304800458,
"acc_norm": 0.8673571001792472,
"acc_norm_stderr": 0.003384951803213478
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406793,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406793
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.01651959427529712,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.01651959427529712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5571199691389221,
"mc2_stderr": 0.015289284314943528
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.01242507818839599
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chanchal21/mini_platypus_two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mingaflo/rebel-dataset-de | ---
license: mit
task_categories:
- summarization
language:
- de
tags:
- wikipedia
- wikidata
- Relation Extraction
- REBEL
pretty_name: German REBEL Dataset
size_categories:
- 100K<n<1M
---
# Dataset Card for German REBEL Dataset
### Dataset Summary
This dataset is the German version of Babelscape/rebel-dataset. It has been generated using [CROCODILE](https://github.com/Babelscape/crocodile).
The Wikipedia version is from November 2022.
### Languages
- German
## Dataset Structure
```json
{"docid": "9400003",
"title": "Odin-Gletscher",
"uri": "Q7077818",
"text": "Der Odin-Gletscher ist ein kleiner Gletscher im ostantarktischen Viktorialand. Er fließt von den Westhängen des Mount Odin in der Asgard Range.\n\nDas New Zealand Antarctic Place-Names Committee benannte ihn in Anlehnung an die Benennung des Mount Odin nach Odin, Göttervater, Kriegs- und Totengott der nordischen Mythologie.",
"entities": [{"uri": "Q35666", "boundaries": [35, 44], "surfaceform": "Gletscher", "annotator": "Me"}, ... ],
"triples": [{"subject": {"uri": "Q7077818", "boundaries": [4, 18], "surfaceform": "Odin-Gletscher", "annotator": "Me"},
"predicate": {"uri": "P31", "boundaries": null, "surfaceform": "ist ein(e)", "annotator": "NoSubject-Triple-aligner"},
"object": {"uri": "Q35666", "boundaries": [35, 44], "surfaceform": "Gletscher", "annotator": "Me"}, "sentence_id": 0,
"dependency_path": null,
"confidence": 0.99560546875,
"annotator": "NoSubject-Triple-aligner"}, ...]
}
```
### Data Instances
The dataset is 1.1 GB unpacked and 195 MB zipped.
### Data Fields
"docid": "9644601",
"title": Wikipedia Title
"uri": "Q4290759",
"text": Wikipedia Abstract
"entities": A list of Entities
- uri: Wikidata URI
- boundaries: Tuple of indices of the entity in the abstract
- surfaceform: text form of entity
- annotator: different annotator classes
"triples": List of Triples as dictionaries
- sentence_id: Sentence number the triple appears in.
- "confidence": float, the confidence of the NLI Model
- subject
- uri: Wikidata Entity URI
- boundaries
- surfaceform
- annotator
- predicate
- uri: Wikidata Relation URI
- boundaries: always null,
- surfaceform: Wikidata Relation Name
- annotator
- object:
- uri: Wikidata Entity URI
- boundaries
- surfaceform
- annotator
### Data Splits
No splits are provided for now since the relation classes are quite imbalanced.
To read the dataset, you can adapt the function provided by https://github.com/Babelscape/rebel:
```python
import json
import logging
import re

import pandas as pd


# Note: this is a method of the REBEL `datasets` builder class (hence `self`).
def _generate_examples(self, filepath):
    """This function returns the examples in the raw (text) form."""
    logging.info("generating examples from = %s", filepath)
    relations_df = pd.read_csv(self.config.data_files['relations'], header=None, sep='\t')
    relations = list(relations_df[0])
    with open(filepath, encoding="utf-8") as f:
        for id_, row in enumerate(f):
            article = json.loads(row)
            prev_len = 0
            if len(article['triples']) == 0:
                continue
            count = 0
            for text_paragraph in article['text'].split('\n'):
                if len(text_paragraph) == 0:
                    continue
                sentences = re.split(r'(?<=[.])\s', text_paragraph)
                text = ''
                for sentence in sentences:
                    text += sentence + ' '
                    if any([entity['boundaries'][0] < len(text) + prev_len < entity['boundaries'][1] for entity in article['entities']]):
                        continue
                    entities = sorted([entity for entity in article['entities'] if prev_len < entity['boundaries'][1] <= len(text) + prev_len], key=lambda tup: tup['boundaries'][0])
                    decoder_output = '<triplet> '
                    for int_ent, entity in enumerate(entities):
                        triplets = sorted([triplet for triplet in article['triples'] if triplet['subject'] == entity and prev_len < triplet['subject']['boundaries'][1] <= len(text) + prev_len and prev_len < triplet['object']['boundaries'][1] <= len(text) + prev_len and triplet['predicate']['surfaceform'] in relations], key=lambda tup: tup['object']['boundaries'][0])
                        if len(triplets) == 0:
                            continue
                        decoder_output += entity['surfaceform'] + ' <subj> '
                        for triplet in triplets:
                            decoder_output += triplet['object']['surfaceform'] + ' <obj> ' + triplet['predicate']['surfaceform'] + ' <subj> '
                        decoder_output = decoder_output[:-len(' <subj> ')]
                        decoder_output += ' <triplet> '
                    decoder_output = decoder_output[:-len(' <triplet> ')]
                    count += 1
                    prev_len += len(text)
                    if len(decoder_output) == 0:
                        text = ''
                        continue
                    text = re.sub(r'([\[\].,!?()])', r' \1 ', text.replace('()', ''))
                    text = re.sub(r'\s{2,}', ' ', text)
                    yield article['uri'] + '-' + str(count), {
                        "title": article['title'],
                        "context": text,
                        "id": article['uri'] + '-' + str(count),
                        "triplets": decoder_output,
                    }
                    text = ''
```
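The `triplets` string yielded above follows the REBEL linearization: each `<triplet>` chunk carries one subject, followed by alternating object/relation pairs separated by `<subj>`/`<obj>` markers. Below is a minimal sketch of the inverse mapping, written here from the construction logic above; it is an illustration, not part of the original repository:
```python
def parse_triplets(seq):
    """Invert the linearization above into (subject, object, relation) tuples."""
    triples = []
    # Each "<triplet>" chunk holds one subject and its relations.
    for chunk in seq.split('<triplet>'):
        if not chunk.strip():
            continue
        subject, _, rest = chunk.partition(' <subj> ')
        # The remainder alternates "object <obj> relation" pairs separated by " <subj> ".
        for pair in rest.split(' <subj> '):
            obj, _, rel = pair.partition(' <obj> ')
            if obj.strip() and rel.strip():
                triples.append((subject.strip(), obj.strip(), rel.strip()))
    return triples

print(parse_triplets('<triplet> Odin-Gletscher <subj> Gletscher <obj> ist ein(e)'))
# -> [('Odin-Gletscher', 'Gletscher', 'ist ein(e)')]
```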
## Dataset Creation
### Curation Rationale
This dataset was created to enable the training of a German BART-based model as a pre-training phase for Relation Extraction.
### Source Data
#### Who are the source language producers?
Any Wikipedia and Wikidata contributor.
### Annotations
#### Annotation process
The dataset was extracted automatically with the cRocoDiLe pipeline (Automatic Relation Extraction Dataset with NLI filtering).
#### Who are the annotators?
The annotations were produced automatically.
### Personal and Sensitive Information
All text is from Wikipedia; any personal or sensitive information present there may also appear in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset serves as a pre-training step for Relation Extraction models. It is distantly annotated, hence it should only be used as such. A model trained solely on this dataset may produce hallucinations owing to the silver-standard nature of the annotations.
### Discussion of Biases
Since the dataset was automatically created from Wikipedia and Wikidata, it may reflect the biases within those sources.
For Wikipedia text, see for example Dinan et al. (2020) on biases in Wikipedia (esp. Table 1), or Blodgett et al. (2020) for a more general discussion of the topic.
For Wikidata, there are class imbalances, also resulting from Wikipedia.
### Other Known Limitations
None for now.
## Additional Information
### Dataset Curators
Me
### Licensing Information
Since anyone can create the dataset on their own using the linked GitHub repository, I am going to use the MIT License.
### Citation Information
Inspired by:
```
@inproceedings{huguet-cabot-navigli-2021-rebel,
title = "REBEL: Relation Extraction By End-to-end Language generation",
author = "Huguet Cabot, Pere-Llu{\'\i}s and
Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
month = nov,
year = "2021",
address = "Online and in the Barceló Bávaro Convention Centre, Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://github.com/Babelscape/rebel/blob/main/docs/EMNLP_2021_REBEL__Camera_Ready_.pdf",
}
```
### Contributions
None for now |
saarus72/ficbook_text_norm | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ru
size_categories:
- 1M<n<10M
---
Inverse-normalized texts obtained from the [ficbook](https://huggingface.co/datasets/IlyaGusev/ficbook) dataset.
Normalized using [these notebooks](https://github.com/saarus72/text_normalization) for a personal [Russian normalization model](https://huggingface.co/saarus72/russian_text_normalizer) (available on an [HF Space](https://huggingface.co/spaces/saarus72/russian-text-normalization) as well).
Everything is put into a single `jsonl` file, with lines like the following (beautified):
```json
{
"replaces": [
{
"text_from": "Боль во всем теле...Боже...я так и знала.... ",
"text_to": "Боль во всем теле...Боже...я так и знала.... "
},
{
"text_from": "5",
"text_to": "Пятая"
},
{
"text_from": " точка буквально ныла от боли, ушиб будет тот еще...садиться не смогу дня ",
"text_to": " точка буквально ныла от боли, ушиб будет тот еще...садиться не смогу дня "
},
{
"text_from": "3",
"text_to": "три"
},
{
"text_from": " так, точно.",
"text_to": " так, точно."
}
]
}
```
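A minimal sketch of reading such a record and reconstructing both sides by concatenating its aligned segments (the filename is a placeholder for the single `jsonl` file described above):
```python
import json

# 'ficbook_text_norm.jsonl' is a hypothetical name for the single jsonl file.
with open('ficbook_text_norm.jsonl', encoding='utf-8') as f:
    for line in f:
        record = json.loads(line)
        source = ''.join(seg['text_from'] for seg in record['replaces'])  # written form (digits kept)
        target = ''.join(seg['text_to'] for seg in record['replaces'])    # verbalized form (digits spelled out)
        print(source)
        print(target)
        break
```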
|
AdapterOcean/med_alpaca_standardized_cluster_89 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 32430824
num_examples: 3456
download_size: 8685630
dataset_size: 32430824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_89"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pokce/NewYorkTimes00-07 | ---
dataset_info:
- config_name: NYT_length128
features:
- name: id
dtype: string
- name: year
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: title
dtype: string
- name: lead_paragraph
dtype: string
- name: input
dtype: string
splits:
- name: NYT_length128
num_bytes: 805898669
num_examples: 497220
download_size: 543799180
dataset_size: 805898669
- config_name: NYT_length256
features:
- name: id
dtype: string
- name: year
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: title
dtype: string
- name: lead_paragraph
dtype: string
- name: input
dtype: string
splits:
- name: NYT_length256
num_bytes: 986506429
num_examples: 407952
download_size: 656741234
dataset_size: 986506429
- config_name: NYT_length32
features:
- name: id
dtype: string
- name: year
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: title
dtype: string
- name: lead_paragraph
dtype: string
- name: input
dtype: string
splits:
- name: NYT_length32
num_bytes: 632875181
num_examples: 659496
download_size: 427035594
dataset_size: 632875181
- config_name: NYT_length512
features:
- name: id
dtype: string
- name: year
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: title
dtype: string
- name: lead_paragraph
dtype: string
- name: input
dtype: string
splits:
- name: NYT_length512
num_bytes: 1255048663
num_examples: 317512
download_size: 818477539
dataset_size: 1255048663
- config_name: NYT_length64
features:
- name: id
dtype: string
- name: year
dtype: string
- name: month
dtype: string
- name: day
dtype: string
- name: title
dtype: string
- name: lead_paragraph
dtype: string
- name: input
dtype: string
splits:
- name: NYT_length64
num_bytes: 715550864
num_examples: 600752
download_size: 484584815
dataset_size: 715550864
configs:
- config_name: NYT_length128
data_files:
- split: NYT_length128
path: NYT_length128/NYT_length128-*
- config_name: NYT_length256
data_files:
- split: NYT_length256
path: NYT_length256/NYT_length256-*
- config_name: NYT_length32
data_files:
- split: NYT_length32
path: NYT_length32/NYT_length32-*
- config_name: NYT_length512
data_files:
- split: NYT_length512
path: NYT_length512/NYT_length512-*
- config_name: NYT_length64
data_files:
- split: NYT_length64
path: NYT_length64/NYT_length64-*
---
|
chrisjay/ratman-stopword-lists-for-african-languages | ---
license:
- other
kaggle_id: rtatman/stopword-lists-for-african-languages
---
# Dataset Card for Stopword Lists for African Languages
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/rtatman/stopword-lists-for-african-languages
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
### Context:
Some words, like “the” or “and” in English, are used a lot in speech and writing. For most Natural Language Processing applications, you will want to remove these very frequent words. This is usually done using a list of “stopwords” which has been compiled by hand.
### Content:
This project uses the source texts provided by the African Storybook Project as a corpus and provides a number of tools to extract frequency lists and lists of stopwords from this corpus for the 60+ languages covered by ASP.
Included in this dataset are the following languages:
* Afrikaans: stoplist and word frequency
* Hausa: stoplist and word frequency
* Lugbarati: word frequency only
* Lugbarati (Official): word frequency only
* Somali: stoplist and word frequency
* Sesotho: stoplist and word frequency
* Kiswahili: stoplist and word frequency
* Yoruba: stoplist and word frequency
* isiZulu: stoplist and word frequency
Files are named using the language’s ISO code. For each language, code.txt is the list of stopwords, and code_frequency_list.txt contains word frequency information. A list of ISO codes and the languages associated with them may be found in ISO_codes.csv.
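A minimal sketch of using one of these lists to filter tokens; the filename `yo.txt` assumes Yoruba’s ISO 639-1 code, so check ISO_codes.csv for the exact name:
```python
def load_stopwords(path):
    """Read a one-word-per-line stopword file into a set."""
    with open(path, encoding='utf-8') as f:
        return {line.strip() for line in f if line.strip()}

# 'yo.txt' is an assumption about the exact ISO code used for Yoruba.
stopwords = load_stopwords('yo.txt')
tokens = "ohun tí ó ṣe pàtàkì jù lọ".lower().split()
content_words = [t for t in tokens if t not in stopwords]
print(content_words)
```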
### Acknowledgements:
This project attempts to fill the gap in language coverage for African-language stoplists by using the freely available and open-licensed ASP Source project as a corpus.
Dual-licensed under the CC-BY and Apache-2.0 licenses. Compiled by Liam Doherty. More information and the scripts used to generate these files are available [here](https://github.com/dohliam/more-stoplists).
### Inspiration:
This dataset is mainly helpful for use during NLP analysis; however, there may be some interesting insights in the data.
* What qualities do stopwords share across languages? Given a novel language, could you predict what its stopwords should be?
* What stopwords are shared across languages?
* Often, related languages will have words with the same meaning and similar spellings. Can you automatically identify any of these pairs of words?
### You may also like:
* [Stopword Lists for 19 Languages (mainly European and South Asian)](https://www.kaggle.com/rtatman/stopword-lists-for-19-languages)
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@rtatman](https://kaggle.com/rtatman)
### Licensing Information
The license for this dataset is other
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
linda996/768782 | ---
license: openrail
---
|
DylanonWic/common_voice_10_1_th_clean_split_3_augment | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: labels
sequence: int64
- name: input_values
sequence: float32
splits:
- name: train
num_bytes: 12080449267
num_examples: 50530
download_size: 12068906392
dataset_size: 12080449267
---
# Dataset Card for "common_voice_10_1_th_clean_split_3_augment_fix_spacial_char"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ryan-Pupia/CS482-TaxiDataSetTest | ---
dataset_info:
features:
- name: key
dtype: string
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: test
num_bytes: 977751
num_examples: 9914
download_size: 521219
dataset_size: 977751
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
explodinggradients/prompt-engineering-papers | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe | ---
pretty_name: Evaluation run of Vasanth/Valor_Macaroni_moe
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Vasanth/Valor_Macaroni_moe](https://huggingface.co/Vasanth/Valor_Macaroni_moe)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T12:08:54.379956](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe/blob/main/results_2024-01-21T12-08-54.379956.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6507794825269755,\n\
\ \"acc_stderr\": 0.03207408539778976,\n \"acc_norm\": 0.650451847875648,\n\
\ \"acc_norm_stderr\": 0.032740448600963236,\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6464987959765103,\n\
\ \"mc2_stderr\": 0.015375185619864559\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725223\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6838279227245568,\n\
\ \"acc_stderr\": 0.004640306719628063,\n \"acc_norm\": 0.8661621190997809,\n\
\ \"acc_norm_stderr\": 0.003397822089857292\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303954,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303954\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139967,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139967\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6464987959765103,\n\
\ \"mc2_stderr\": 0.015375185619864559\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359244\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420869\n }\n}\n```"
repo_url: https://huggingface.co/Vasanth/Valor_Macaroni_moe
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-08-54.379956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- '**/details_harness|winogrande|5_2024-01-21T12-08-54.379956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T12-08-54.379956.parquet'
- config_name: results
data_files:
- split: 2024_01_21T12_08_54.379956
path:
- results_2024-01-21T12-08-54.379956.parquet
- split: latest
path:
- results_2024-01-21T12-08-54.379956.parquet
---
# Dataset Card for Evaluation run of Vasanth/Valor_Macaroni_moe
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Vasanth/Valor_Macaroni_moe](https://huggingface.co/Vasanth/Valor_Macaroni_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe",
"harness_winogrande_5",
split="train")
```
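The aggregated metrics can be loaded the same way through the "results" configuration; a minimal sketch, reusing the config and split names declared in this card's metadata:
```python
from datasets import load_dataset
# "results" and the "latest" split are taken from the configs listed in the metadata above
results = load_dataset("open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe",
                       "results",
                       split="latest")
```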
## Latest results
These are the [latest results from run 2024-01-21T12:08:54.379956](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe/blob/main/results_2024-01-21T12-08-54.379956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6507794825269755,
"acc_stderr": 0.03207408539778976,
"acc_norm": 0.650451847875648,
"acc_norm_stderr": 0.032740448600963236,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6464987959765103,
"mc2_stderr": 0.015375185619864559
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518822,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725223
},
"harness|hellaswag|10": {
"acc": 0.6838279227245568,
"acc_stderr": 0.004640306719628063,
"acc_norm": 0.8661621190997809,
"acc_norm_stderr": 0.003397822089857292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303954,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303954
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139967,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139967
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.01750128507455183,
"mc2": 0.6464987959765103,
"mc2_stderr": 0.015375185619864559
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359244
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420869
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ch08931/Andreza | ---
license: openrail
---
|
SEACrowd/indonli | ---
tags:
- textual-entailment
language:
- ind
---
# indonli
This dataset is designed for the Natural Language Inference (NLI) NLP task. It provides a challenging test-bed
for Indonesian NLI by explicitly incorporating various linguistic phenomena such as numerical reasoning, structural
changes, idioms, and temporal and spatial reasoning.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
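A minimal loading sketch (the plain `load_dataset` call and the absence of a config name are assumptions; see the NusaCatalogue link below for task-specific configurations):
```python
# requires `pip install nusacrowd` beforehand, as noted above
from datasets import load_dataset
# trust_remote_code may be needed since the loader relies on a dataset script
data = load_dataset("SEACrowd/indonli", trust_remote_code=True)
print(data)
```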
## Citation
```
@inproceedings{mahendra-etal-2021-indonli,
title = "{I}ndo{NLI}: A Natural Language Inference Dataset for {I}ndonesian",
author = "Mahendra, Rahmad and Aji, Alham Fikri and Louvan, Samuel and Rahman, Fahrurrozi and Vania, Clara",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.821",
pages = "10511--10527",
}
```
## License
Creative Common Attribution Share-Alike 4.0 International
## Homepage
[https://github.com/ir-nlp-csui/indonli](https://github.com/ir-nlp-csui/indonli)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
sumanlama2000/amazon-shoe-reviews | ---
dataset_info:
features:
- name: labels
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 16847665.2
num_examples: 90000
- name: test
num_bytes: 1871962.8
num_examples: 10000
download_size: 11141108
dataset_size: 18719628.0
---
# Dataset Card for "amazon-shoe-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mugithi/down_sampled_file_traffic_dataset | ---
dataset_info:
features:
- name: avgSpeed
dtype: int64
- name: junction
dtype: int64
- name: TIMESTAMP
dtype: string
- name: vehicleCount
dtype: int64
- name: Year
dtype: int64
- name: Month
dtype: int64
- name: Date_no
dtype: int64
- name: Hour
dtype: int64
- name: Day
dtype: string
splits:
- name: train
num_bytes: 979111222
num_examples: 10861705
download_size: 82941334
dataset_size: 979111222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zuoyexifeng/med_ft_1000 | ---
license: apache-2.0
---
|
CyberHarem/junyou_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of junyou/隼鷹 (Kantai Collection)
This is the dataset of junyou/隼鷹 (Kantai Collection), containing 442 images and their tags.
The core tags of this character are `purple_hair, long_hair, spiked_hair, purple_eyes, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 442 | 380.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junyou_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 442 | 269.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junyou_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 914 | 516.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junyou_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 442 | 357.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junyou_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 914 | 652.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junyou_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
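For the IMG+TXT packages, a download-and-unpack sketch analogous to the waifuc snippet below (the archive name comes from the table above; the local directory name is a hypothetical):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# fetch the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/junyou_kantaicollection',
    repo_type='dataset',
    filename='dataset-800.zip',
)
extract_dir = 'dataset_800'  # hypothetical local directory
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```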
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/junyou_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blouse, dress_shirt, hakama_pants, magatama, onmyouji, red_hakama, shikigami, smile, solo, hakama_skirt, red_pants, looking_at_viewer, scroll, simple_background, full_body, open_mouth, airplane, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, blouse, dress_shirt, hakama_pants, red_hakama, red_pants, smile, solo, blush, open_mouth, vest, magatama_earrings, onmyouji, scroll, shikigami, airplane, fire, looking_at_viewer |
| 2 | 6 |  |  |  |  |  | 1girl, blush, dress_shirt, drunk, smile, solo, hakama_pants, magatama, sake_bottle, sitting, open_mouth, red_hakama, red_pants, vest, blouse, looking_at_viewer |
| 3 | 5 |  |  |  |  |  | 1girl, magatama, smile, solo, japanese_clothes, onmyouji, scroll, shikigami, dress_shirt, looking_at_viewer, open_mouth |
| 4 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, magatama, solo, shirt, blush, grin, upper_body |
| 5 | 11 |  |  |  |  |  | 1girl, blush, solo, nude, partially_submerged, looking_at_viewer, water, onsen, smile, open_mouth, sake, bathing, steam, tokkuri, cleavage, choko_(cup), nipples, sakazuki |
| 6 | 5 |  |  |  |  |  | 2girls, blush, brown_hair, open_mouth, smile, magatama, cup, dress_shirt, ^_^ |
| 7 | 12 |  |  |  |  |  | playboy_bunny, 1girl, detached_collar, fake_animal_ears, rabbit_ears, solo, wrist_cuffs, cleavage, pantyhose, blush, leotard, looking_at_viewer, simple_background, bowtie, magatama, rabbit_tail, fishnets, grin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blouse | dress_shirt | hakama_pants | magatama | onmyouji | red_hakama | shikigami | smile | solo | hakama_skirt | red_pants | looking_at_viewer | scroll | simple_background | full_body | open_mouth | airplane | white_background | blush | vest | magatama_earrings | fire | drunk | sake_bottle | sitting | japanese_clothes | shirt | grin | upper_body | nude | partially_submerged | water | onsen | sake | bathing | steam | tokkuri | cleavage | choko_(cup) | nipples | sakazuki | 2girls | brown_hair | cup | ^_^ | playboy_bunny | detached_collar | fake_animal_ears | rabbit_ears | wrist_cuffs | pantyhose | leotard | bowtie | rabbit_tail | fishnets |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------|:---------------|:-----------|:-----------|:-------------|:------------|:--------|:-------|:---------------|:------------|:--------------------|:---------|:--------------------|:------------|:-------------|:-----------|:-------------------|:--------|:-------|:--------------------|:-------|:--------|:--------------|:----------|:-------------------|:--------|:-------|:-------------|:-------|:----------------------|:--------|:--------|:-------|:----------|:--------|:----------|:-----------|:--------------|:----------|:-----------|:---------|:-------------|:------|:------|:----------------|:------------------|:-------------------|:--------------|:--------------|:------------|:----------|:---------|:--------------|:-----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | | X | X | | | | X | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | X | X | | X | X | X | | | X | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | X | | | | | X | | | X | | | | | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | | | | | | | X | X | | | X | | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | | X | | X | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | X | | | | X | | | | | X | | | X | | X | | | | | X | | | | | | | | | X | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
KETI-AIR/kor_aeslc | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: data_index_by_user
dtype: int32
- name: subject_line
dtype: string
- name: email_body
dtype: string
splits:
- name: train
num_bytes: 11852680
num_examples: 14436
- name: validation
num_bytes: 1699457
num_examples: 1960
- name: test
num_bytes: 1539749
num_examples: 1906
download_size: 8634308
dataset_size: 15091886
license: cc-by-nc-sa-4.0
---
# Dataset Card for "kor_aeslc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
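A minimal loading sketch based on the fields and splits declared in the metadata above:
```python
from datasets import load_dataset
ds = load_dataset("KETI-AIR/kor_aeslc")
example = ds["validation"][0]
print(example["subject_line"])      # email subject line
print(example["email_body"][:200])  # truncated email body
```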
# Source Data Citation Information
```
@inproceedings{zhang-tetreault-2019-email,
title = "This Email Could Save Your Life: Introducing the Task of Email Subject Line Generation",
author = "Zhang, Rui and
Tetreault, Joel",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P19-1043",
doi = "10.18653/v1/P19-1043",
pages = "446--456",
}
``` |
open-llm-leaderboard/details_shadowml__DareBeagel-2x7B | ---
pretty_name: Evaluation run of shadowml/DareBeagel-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shadowml/DareBeagel-2x7B](https://huggingface.co/shadowml/DareBeagel-2x7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__DareBeagel-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T00:26:45.043532](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__DareBeagel-2x7B/blob/main/results_2024-01-17T00-26-45.043532.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651052782573929,\n\
\ \"acc_stderr\": 0.03216743677709112,\n \"acc_norm\": 0.6503711654008105,\n\
\ \"acc_norm_stderr\": 0.032838386718088884,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6908761125222723,\n\
\ \"mc2_stderr\": 0.01507267076649574\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7109141605257917,\n\
\ \"acc_stderr\": 0.004524113671259701,\n \"acc_norm\": 0.8811989643497311,\n\
\ \"acc_norm_stderr\": 0.003228929916459686\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922435,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922435\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6908761125222723,\n\
\ \"mc2_stderr\": 0.01507267076649574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971859\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954772\n }\n}\n```"
repo_url: https://huggingface.co/shadowml/DareBeagel-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|arc:challenge|25_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|gsm8k|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hellaswag|10_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-26-45.043532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-26-45.043532.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- '**/details_harness|winogrande|5_2024-01-17T00-26-45.043532.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T00-26-45.043532.parquet'
- config_name: results
data_files:
- split: 2024_01_17T00_26_45.043532
path:
- results_2024-01-17T00-26-45.043532.parquet
- split: latest
path:
- results_2024-01-17T00-26-45.043532.parquet
---
# Dataset Card for Evaluation run of shadowml/DareBeagel-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shadowml/DareBeagel-2x7B](https://huggingface.co/shadowml/DareBeagel-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shadowml__DareBeagel-2x7B",
"harness_winogrande_5",
split="train")
```
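Each evaluated task has its own configuration, so it can help to enumerate them before loading. A minimal sketch using the `datasets` helper `get_dataset_config_names`:
```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_shadowml__DareBeagel-2x7B")
print(len(configs))
print([name for name in configs if "winogrande" in name])
```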
## Latest results
These are the [latest results from run 2024-01-17T00:26:45.043532](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__DareBeagel-2x7B/blob/main/results_2024-01-17T00-26-45.043532.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651052782573929,
"acc_stderr": 0.03216743677709112,
"acc_norm": 0.6503711654008105,
"acc_norm_stderr": 0.032838386718088884,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6908761125222723,
"mc2_stderr": 0.01507267076649574
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.7109141605257917,
"acc_stderr": 0.004524113671259701,
"acc_norm": 0.8811989643497311,
"acc_norm_stderr": 0.003228929916459686
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922435,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6908761125222723,
"mc2_stderr": 0.01507267076649574
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971859
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954772
}
}
```
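The aggregated metrics above can also be loaded programmatically. A minimal sketch (assuming the `results` configuration and its `latest` split follow the same conventions as the per-task configurations of this dataset):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-c-v1",
    "results",
    split="latest",
)
```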
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chiggly007/semrush | ---
license: apache-2.0
---
|
Jiefuo/hierarchical-event-grounding | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b | ---
pretty_name: Evaluation run of fireballoon/baichuan-vicuna-chinese-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fireballoon/baichuan-vicuna-chinese-7b](https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T14:20:11.480532](https://huggingface.co/datasets/open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b/blob/main/results_2023-09-17T14-20-11.480532.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22242030201342283,\n\
\ \"em_stderr\": 0.00425891841660003,\n \"f1\": 0.2740048238255038,\n\
\ \"f1_stderr\": 0.004278992735739422,\n \"acc\": 0.3619266227972807,\n\
\ \"acc_stderr\": 0.00976430949757211\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.22242030201342283,\n \"em_stderr\": 0.00425891841660003,\n\
\ \"f1\": 0.2740048238255038,\n \"f1_stderr\": 0.004278992735739422\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.006298221796179566\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964655\n\
\ }\n}\n```"
repo_url: https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|arc:challenge|25_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T14_20_11.480532
path:
- '**/details_harness|drop|3_2023-09-17T14-20-11.480532.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T14-20-11.480532.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T14_20_11.480532
path:
- '**/details_harness|gsm8k|5_2023-09-17T14-20-11.480532.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T14-20-11.480532.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hellaswag|10_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T10:02:03.270696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T10:02:03.270696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T14_20_11.480532
path:
- '**/details_harness|winogrande|5_2023-09-17T14-20-11.480532.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T14-20-11.480532.parquet'
- config_name: results
data_files:
- split: 2023_08_10T10_02_03.270696
path:
- results_2023-08-10T10:02:03.270696.parquet
- split: 2023_09_17T14_20_11.480532
path:
- results_2023-09-17T14-20-11.480532.parquet
- split: latest
path:
- results_2023-09-17T14-20-11.480532.parquet
---
# Dataset Card for Evaluation run of fireballoon/baichuan-vicuna-chinese-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fireballoon/baichuan-vicuna-chinese-7b](https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b",
"harness_winogrande_5",
split="train")
```
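Each run is also reachable directly through its timestamped split. A minimal sketch (the configuration and split names below are taken verbatim from this card's metadata):
```python
from datasets import load_dataset

# Load the gsm8k details for the 2023-09-17 run specifically
run_details = load_dataset(
    "open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b",
    "harness_gsm8k_5",
    split="2023_09_17T14_20_11.480532",
)
```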
## Latest results
These are the [latest results from run 2023-09-17T14:20:11.480532](https://huggingface.co/datasets/open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b/blob/main/results_2023-09-17T14-20-11.480532.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22242030201342283,
"em_stderr": 0.00425891841660003,
"f1": 0.2740048238255038,
"f1_stderr": 0.004278992735739422,
"acc": 0.3619266227972807,
"acc_stderr": 0.00976430949757211
},
"harness|drop|3": {
"em": 0.22242030201342283,
"em_stderr": 0.00425891841660003,
"f1": 0.2740048238255038,
"f1_stderr": 0.004278992735739422
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179566
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964655
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ransaka/youtube_recommendation_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 32119283.056155507
num_examples: 1041
- name: test
num_bytes: 10737281.943844492
num_examples: 348
download_size: 41663238
dataset_size: 42856565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Dataset licensed under: Creative Commons Attribution (CC BY) |
Berzerker/iiit5k_ocr_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: output_json_dumpsed
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: data/*.parquet
language:
- en
--- |
Rimyy/problemMathGemma | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2763268
num_examples: 3500
download_size: 1216136
dataset_size: 2763268
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AnonymousPaperSubmissions/Training_CC | ---
license: mit
---
|
potradovec/html_prompts | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29320.48581762802
num_examples: 80
- name: test
num_bytes: 7330.121454407005
num_examples: 20
download_size: 27897
dataset_size: 36650.60727203503
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
boapps/kmdb_people_classification | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: positive_people
sequence: string
- name: negative_people
sequence: string
splits:
- name: test
num_bytes: 3705659
num_examples: 750
- name: train
num_bytes: 58657685
num_examples: 11417
- name: validation
num_bytes: 7034843
num_examples: 1448
download_size: 41330238
dataset_size: 69398187
---
# Dataset Card for "kmdb_people_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-econometrics-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 32655
num_examples: 114
download_size: 18213
dataset_size: 32655
---
# Dataset Card for "mmlu-econometrics-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shawt100/Shawt | ---
license: openrail
task_categories:
- text-generation
tags:
- art
pretty_name: ShawtSanders
--- |
thaiqa_squad | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- th
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-thaiqa
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
paperswithcode_id: null
pretty_name: thaiqa-squad
dataset_info:
features:
- name: question_id
dtype: int32
- name: article_id
dtype: int32
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer
dtype: string
- name: answer_begin_position
dtype: int32
- name: answer_end_position
dtype: int32
config_name: thaiqa_squad
splits:
- name: train
num_bytes: 47905050
num_examples: 4000
- name: validation
num_bytes: 744813
num_examples: 74
download_size: 10003354
dataset_size: 48649863
---
# Dataset Card for `thaiqa-squad`
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://github.com/pythainlp/thaiqa_squad (original `thaiqa` at https://aiforthai.in.th/)
- **Repository:** http://github.com/pythainlp/thaiqa_squad
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** http://github.com/pythainlp/ (original `thaiqa` at https://aiforthai.in.th/)
### Dataset Summary
`thaiqa_squad` is an open-domain, extractive question answering dataset (4,000 questions in `train` and 74 questions in `dev`) in [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) format, originally created by [NECTEC](https://www.nectec.or.th/en/) from Wikipedia articles and adapted to [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) format by [PyThaiNLP](https://github.com/PyThaiNLP/).
### Supported Tasks and Leaderboards
extractive question answering
### Languages
Thai
## Dataset Structure
### Data Instances
```
{'answers': {'answer': ['ฮิกกิ้นส์'], 'answer_begin_position': [528], 'answer_end_position': [537]}, 'article_id': 115035, 'context': '<doc id="115035" url="https://th.wikipedia.org/wiki?curid=115035" title="เบนจี้">เบนจี้ เบนจี้ () เป็นชื่อตัวละครหมาพันทางแสนรู้ ที่ปรากฏอยู่ในภาพยนตร์หลายเรื่องที่เขียนบท และกำกับโดย โจ แคมป์ ในช่วงทศวรรษ 1970 ถึง 1980 ภาพยนตร์เรื่องแรกในชุด ใช้ชื่อเรื่องว่า เบนจี้ เช่นเดียวกับตัวละคร ถ่ายทำที่เมืองดัลลัส รัฐเทกซัส ฉายครั้งแรกในปี พ.ศ. 2517 ภาพยนตร์ได้รับการเสนอชื่อเข้าชิงรางวัลออสการ์ และได้รางวัลลูกโลกทองคำ สาขาเพลงประกอบยอดเยี่ยม จากเพลง Benji\'s Theme (I Feel Love) ร้องโดย ชาร์ลี ริช หมาที่แสดงเป็นเบนจี้ตัวแรก ชื่อว่า ฮิกกิ้นส์ (พ.ศ. 2502 - พ.ศ. 2518) มีอายุถึง 15 ปีแล้วในขณะแสดง หลังจากภาพยนตร์ออกฉายได้ไม่นาน มันก็ตายในปี พ.ศ. 2518เบนจี้ในภาพยนตร์เบนจี้ในภาพยนตร์. - พ.ศ. 2517, Benji (ภาพยนตร์) - พ.ศ. 2520, For the Love of Benji (ภาพยนตร์) - พ.ศ. 2521, Benji\'s Very Own Christmas Story (ภาพยนตร์โทรทัศน์) - พ.ศ. 2523, Oh Heavenly Dog (ภาพยนตร์) - พ.ศ. 2523, Benji at Work (ภาพยนตร์โทรทัศน์) - พ.ศ. 2524, Benji Takes a Dive at Marineland (ภาพยนตร์โทรทัศน์) - พ.ศ. 2526, Benji, Zax & the Alien Prince (ภาพยนตร์ซีรีส์) - พ.ศ. 2530, Benji the Hunted (ภาพยนตร์) - พ.ศ. 2547, Benji: Off the Leash! (ภาพยนตร์) - พ.ศ. 2550, Benji: The Barkening (ภาพยนตร์)</doc>\n', 'question': 'สุนัขตัวแรกรับบทเป็นเบนจี้ในภาพยนตร์เรื่อง Benji ที่ออกฉายในปี พ.ศ. 2517 มีชื่อว่าอะไร', 'question_id': 1}
{'answers': {'answer': ['ชาร์ลี ริช'], 'answer_begin_position': [482], 'answer_end_position': [492]}, 'article_id': 115035, 'context': '<doc id="115035" url="https://th.wikipedia.org/wiki?curid=115035" title="เบนจี้">เบนจี้ เบนจี้ () เป็นชื่อตัวละครหมาพันทางแสนรู้ ที่ปรากฏอยู่ในภาพยนตร์หลายเรื่องที่เขียนบท และกำกับโดย โจ แคมป์ ในช่วงทศวรรษ 1970 ถึง 1980 ภาพยนตร์เรื่องแรกในชุด ใช้ชื่อเรื่องว่า เบนจี้ เช่นเดียวกับตัวละคร ถ่ายทำที่เมืองดัลลัส รัฐเทกซัส ฉายครั้งแรกในปี พ.ศ. 2517 ภาพยนตร์ได้รับการเสนอชื่อเข้าชิงรางวัลออสการ์ และได้รางวัลลูกโลกทองคำ สาขาเพลงประกอบยอดเยี่ยม จากเพลง Benji\'s Theme (I Feel Love) ร้องโดย ชาร์ลี ริช หมาที่แสดงเป็นเบนจี้ตัวแรก ชื่อว่า ฮิกกิ้นส์ (พ.ศ. 2502 - พ.ศ. 2518) มีอายุถึง 15 ปีแล้วในขณะแสดง หลังจากภาพยนตร์ออกฉายได้ไม่นาน มันก็ตายในปี พ.ศ. 2518เบนจี้ในภาพยนตร์เบนจี้ในภาพยนตร์. - พ.ศ. 2517, Benji (ภาพยนตร์) - พ.ศ. 2520, For the Love of Benji (ภาพยนตร์) - พ.ศ. 2521, Benji\'s Very Own Christmas Story (ภาพยนตร์โทรทัศน์) - พ.ศ. 2523, Oh Heavenly Dog (ภาพยนตร์) - พ.ศ. 2523, Benji at Work (ภาพยนตร์โทรทัศน์) - พ.ศ. 2524, Benji Takes a Dive at Marineland (ภาพยนตร์โทรทัศน์) - พ.ศ. 2526, Benji, Zax & the Alien Prince (ภาพยนตร์ซีรีส์) - พ.ศ. 2530, Benji the Hunted (ภาพยนตร์) - พ.ศ. 2547, Benji: Off the Leash! (ภาพยนตร์) - พ.ศ. 2550, Benji: The Barkening (ภาพยนตร์)</doc>\n', 'question': "เพลง Benji's Theme ใช้ประกอบภาพยนตร์เรื่อง Benji ในปีพ.ศ. 2517 ขับร้องโดยใคร", 'question_id': 2035}
```
### Data Fields
```
{
"question_id": question id
"article_id": article id
"context": article texts
"question": question
"answers":
{
"answer": answer text
"answer_begin_position": answer beginning position
"answer_end_position": answer exclusive upper bound position
}
}
```
### Data Splits
| | train | valid |
|-------------------------|-------------|-------------|
| # questions | 4000 | 74 |
| # avg words in context | 1186.740750 | 1016.459459 |
| # avg words in question | 14.325500 | 12.743243 |
| # avg words in answer | 3.279750 | 4.608108 |
## Dataset Creation
### Curation Rationale
[PyThaiNLP](https://github.com/PyThaiNLP/) created `thaiqa_squad` as a [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) version of [thaiqa](http://copycatch.in.th/thai-qa-task.html). [thaiqa](https://aiforthai.in.th/corpus.php) is part of [The 2nd Question answering program from Thai Wikipedia](http://copycatch.in.th/thai-qa-task.html) of [National Software Contest 2020](http://nsc.siit.tu.ac.th/GENA2/login.php).
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Wikipedia authors for contexts and [NECTEC](https://www.nectec.or.th/en/) for questions and answer annotations
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[NECTEC](https://www.nectec.or.th/en/)
### Personal and Sensitive Information
All contents are from Wikipedia. No personal or sensitive information is expected to be included.
## Considerations for Using the Data
### Social Impact of Dataset
- open-domain, extractive question answering in Thai
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Dataset provided for research purposes only. Please check dataset license for additional information.
The contexts include `<doc>` tags at the start and at the end; a minimal sketch for stripping them follows.
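A sketch of removing these tags (a hypothetical `strip_doc_tags` helper; note that the answer begin/end positions are computed on the raw context, so they must be shifted by the length of the removed opening tag):
```python
import re

from datasets import load_dataset

ds = load_dataset("thaiqa_squad", "thaiqa_squad", split="train")

def strip_doc_tags(example):
    # Length of the opening <doc ...> tag, by which answer offsets must shift
    opening = re.match(r"<doc[^>]*>", example["context"])
    offset = len(opening.group(0)) if opening else 0
    # Drop both the opening <doc ...> tag and the closing </doc> tag
    example["context"] = re.sub(r"</?doc[^>]*>", "", example["context"]).rstrip()
    example["answers"]["answer_begin_position"] = [
        p - offset for p in example["answers"]["answer_begin_position"]
    ]
    example["answers"]["answer_end_position"] = [
        p - offset for p in example["answers"]["answer_end_position"]
    ]
    return example

ds_clean = ds.map(strip_doc_tags)
```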
## Additional Information
### Dataset Curators
[NECTEC](https://www.nectec.or.th/en/) for the original [thaiqa](https://aiforthai.in.th/corpus.php). SQuAD formatting by [PyThaiNLP](https://github.com/PyThaiNLP/).
### Licensing Information
CC-BY-NC-SA 3.0
### Citation Information
No clear citation guidelines from source: https://aiforthai.in.th/corpus.php
SQuAD version: https://github.com/PyThaiNLP/thaiqa_squad
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
id_clickbait | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- id
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- fact-checking
pretty_name: Indonesian Clickbait Headlines
dataset_info:
- config_name: annotated
features:
- name: id
dtype: string
- name: title
dtype: string
- name: label
dtype:
class_label:
names:
'0': non-clickbait
'1': clickbait
splits:
- name: train
num_bytes: 1268698
num_examples: 15000
download_size: 150769127
dataset_size: 1268698
- config_name: raw
features:
- name: id
dtype: string
- name: title
dtype: string
- name: source
dtype: string
- name: date
dtype: string
- name: category
dtype: string
- name: sub-category
dtype: string
- name: content
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 81669386
num_examples: 38655
download_size: 150769127
dataset_size: 81669386
---
# Dataset Card for Indonesian Clickbait Headlines
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://data.mendeley.com/datasets/k42j7x2kpn/1
- **Repository:**
- **Paper:** [CLICK-ID: A Novel Dataset for Indonesian Clickbait Headlines](https://www.sciencedirect.com/science/article/pii/S2352340920311252#!)
- **Leaderboard:**
- **Point of Contact:** [Andika William](mailto:andika.william@mail.ugm.ac.id), [Yunita Sari](mailto:yunita.sari@ugm.ac.id)
### Dataset Summary
The CLICK-ID dataset is a collection of Indonesian news headlines that were collected from 12 local online news
publishers: detikNews, Fimela, Kapanlagi, Kompas, Liputan6, Okezone, Posmetro-Medan, Republika, Sindonews, Tempo,
Tribunnews, and Wowkeren. The dataset comprises two main parts: (i) 46,119 raw article data, and (ii)
15,000 clickbait-annotated sample headlines. Annotation was conducted with 3 annotators examining each headline.
Judgments were based only on the headline, and the majority vote is considered the ground truth. In the annotated
sample, the annotation shows 6,290 clickbait and 8,710 non-clickbait headlines.
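Both parts are exposed as separate configurations; a minimal loading sketch (configuration names taken from this card's metadata):
```python
from datasets import load_dataset

# 15,000 headlines annotated as clickbait / non-clickbait
annotated = load_dataset("id_clickbait", "annotated", split="train")

# Raw articles with full content and publisher metadata
raw = load_dataset("id_clickbait", "raw", split="train")
```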
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Indonesian
## Dataset Structure
### Data Instances
An example of the annotated article:
```
{
'id': '100',
'label': 1,
'title': "SAH! Ini Daftar Nama Menteri Kabinet Jokowi - Ma'ruf Amin"
}
```
### Data Fields
#### Annotated
- `id`: id of the sample
- `title`: the title of the news article
- `label`: the label of the article, either non-clickbait or clickbait
#### Raw
- `id`: id of the sample
- `title`: the title of the news article
- `source`: the name of the publisher/newspaper
- `date`: date
- `category`: the category of the article
- `sub-category`: the sub category of the article
- `content`: the content of the article
- `url`: the url of the article
### Data Splits
The dataset contains a train split only.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Creative Commons Attribution 4.0 International license
### Citation Information
```
@article{WILLIAM2020106231,
title = "CLICK-ID: A novel dataset for Indonesian clickbait headlines",
journal = "Data in Brief",
volume = "32",
pages = "106231",
year = "2020",
issn = "2352-3409",
doi = "https://doi.org/10.1016/j.dib.2020.106231",
url = "http://www.sciencedirect.com/science/article/pii/S2352340920311252",
author = "Andika William and Yunita Sari",
keywords = "Indonesian, Natural Language Processing, News articles, Clickbait, Text-classification",
abstract = "News analysis is a popular task in Natural Language Processing (NLP). In particular, the problem of clickbait in news analysis has gained attention in recent years [1, 2]. However, the majority of the tasks has been focused on English news, in which there is already a rich representative resource. For other languages, such as Indonesian, there is still a lack of resource for clickbait tasks. Therefore, we introduce the CLICK-ID dataset of Indonesian news headlines extracted from 12 Indonesian online news publishers. It is comprised of 15,000 annotated headlines with clickbait and non-clickbait labels. Using the CLICK-ID dataset, we then developed an Indonesian clickbait classification model achieving favourable performance. We believe that this corpus will be useful for replicable experiments in clickbait detection or other experiments in NLP areas."
}
```
### Contributions
Thanks to [@cahya-wirawan](https://github.com/cahya-wirawan) for adding this dataset.
|
welteny/CHACAL | ---
license: openrail
---
|
Akajackson/hand_open | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 285662808.768
num_examples: 72284
- name: validation
num_bytes: 11469570.504
num_examples: 1544
download_size: 348851570
dataset_size: 297132379.272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
MaxP/agro_riego | ---
license: unknown
---
|
steve1989/financial_news_headlines | ---
dataset_info:
features:
- name: Headlines
dtype: string
- name: sentiment_label
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1544963
num_examples: 13031
- name: validation
num_bytes: 328126
num_examples: 2793
- name: test
num_bytes: 331003
num_examples: 2793
download_size: 1258323
dataset_size: 2204092
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CVasNLPExperiments/VQAv2_minival_no_image_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_Q_rices_ns_25994 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random_
num_bytes: 3687274
num_examples: 25994
download_size: 1321705
dataset_size: 3687274
---
# Dataset Card for "VQAv2_minival_no_image_google_flan_t5_xl_mode_T_A_D_PNP_FILTER_Q_rices_ns_25994"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/gtzan_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 48069680
num_examples: 1000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 48069680
num_examples: 1000
- name: academicodec_hifi_24k_320d
num_bytes: 72069680
num_examples: 1000
- name: audiodec_24k_320d
num_bytes: 153685680
num_examples: 1000
- name: dac_16k
num_bytes: 293861680
num_examples: 1000
- name: dac_24k
num_bytes: 815797680
num_examples: 1000
- name: dac_44k
num_bytes: 243737680
num_examples: 1000
- name: encodec_24k
num_bytes: 36077680
num_examples: 1000
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 192437680
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 384437680
num_examples: 1000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 384437680
num_examples: 1000
- name: speech_tokenizer_16k
num_bytes: 96149680
num_examples: 1000
download_size: 620118689
dataset_size: 3922145200
---
# Dataset Card for "gtzan_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pravsels/manim-physics_Matheart_issues | ---
dataset_info:
features:
- name: number
dtype: int64
- name: content
dtype: string
- name: comments
sequence: string
splits:
- name: train
num_bytes: 17964
num_examples: 40
download_size: 10214
dataset_size: 17964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/istella22_test_fold1 | ---
pretty_name: '`istella22/test/fold1`'
viewer: false
source_datasets: ['irds/istella22']
task_categories:
- text-retrieval
---
# Dataset Card for `istella22/test/fold1`
The `istella22/test/fold1` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/istella22#istella22/test/fold1).
# Data
This dataset provides:
- `queries` (i.e., topics); count=440
- `qrels`: (relevance assessments); count=2,164
- For `docs`, use [`irds/istella22`](https://huggingface.co/datasets/irds/istella22)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/istella22_test_fold1', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/istella22_test_fold1', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
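The qrels can then be reshaped into a `{query_id: {doc_id: relevance}}` mapping for standard IR evaluation tooling (a minimal sketch mirroring the usage pattern above, assuming the field names shown):
```python
from collections import defaultdict

from datasets import load_dataset

qrels = load_dataset('irds/istella22_test_fold1', 'qrels')
qrels_dict = defaultdict(dict)
for record in qrels:
    # Each record carries a query_id, a doc_id, and a graded relevance label.
    qrels_dict[record['query_id']][record['doc_id']] = int(record['relevance'])
```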
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
carnival13/rbrt_test_val_lrg3 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 148079605
num_examples: 104550
download_size: 32715970
dataset_size: 148079605
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_test_val_lrg3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Saviourscs/Resarch_anjali | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 654808
num_examples: 460
download_size: 362324
dataset_size: 654808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
XTer123/AudioLabelingTestDataset | ---
license: mit
---
|
ivomaioli/NevioZanette | ---
license: apache-2.0
---
|
tyzhu/squad_qa_baseline_v5_full | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2496440
num_examples: 2385
- name: validation
num_bytes: 335684
num_examples: 300
download_size: 0
dataset_size: 2832124
---
# Dataset Card for "squad_qa_baseline_v5_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cohere/miracl-zh-corpus-22-12 | ---
annotations_creators:
- expert-generated
language:
- zh
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (zh) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-zh-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-zh-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-zh-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-zh-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
## Embeddings
We compute the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-zh-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-zh-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/miracl-zh-corpus-22-12", split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset(f"Cohere/miracl-zh-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-zh-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-zh-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search the documents, use the **dot product** as the similarity function: compare the query embedding against the document embeddings either with a vector database (recommended) or by computing the dot products directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
# Load documents + embeddings
docs = load_dataset("Cohere/miracl-zh-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-zh-queries-22-12", split="dev")
# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor(query['emb']).unsqueeze(0)  # shape: (1, dim)
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking-based metric), as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it gives the fraction of queries for which a relevant document is found among the top-3 results.
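For concreteness, hit@3 can be computed as follows (a minimal sketch; `top3` and `qrels` are hypothetical stand-ins for your retrieval results and the MIRACL relevance judgments):
```python
def hit_at_3(top3, qrels):
    """Fraction of queries with at least one relevant doc in the top-3 results."""
    hits = sum(
        1
        for qid, docs in top3.items()
        if any(doc_id in qrels.get(qid, set()) for doc_id in docs)
    )
    return hits / len(top3)

# Hypothetical example: one of two queries retrieves a relevant doc in its top-3.
top3 = {"q1": ["d1", "d2", "d3"], "q2": ["d7", "d8", "d9"]}
qrels = {"q1": {"d2"}, "q2": {"d4"}}
print(hit_at_3(top3, qrels))  # 0.5
```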
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevance. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes; real nDCG@10 and hit@3 performance is likely higher than reported.
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
zakria/test_cs_done | ---
dataset_info:
features:
- name: audio_file_path
dtype: string
- name: cs_sentence
dtype: string
splits:
- name: train
num_bytes: 702
num_examples: 3
download_size: 3073
dataset_size: 702
---
# Dataset Card for "test_cs_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quan246/half_doc_news_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 277928
num_examples: 1000
- name: dev
num_bytes: 26405
num_examples: 100
download_size: 178400
dataset_size: 304333
---
# Dataset Card for "half_doc_news_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T04:59:45.337848](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down/blob/main/results_2023-10-28T04-59-45.337848.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.21780620805369127,\n\
\ \"em_stderr\": 0.004226997205914972,\n \"f1\": 0.26442428691275155,\n\
\ \"f1_stderr\": 0.004202345101166552,\n \"acc\": 0.4478647632531162,\n\
\ \"acc_stderr\": 0.010273220250493906\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.21780620805369127,\n \"em_stderr\": 0.004226997205914972,\n\
\ \"f1\": 0.26442428691275155,\n \"f1_stderr\": 0.004202345101166552\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \
\ \"acc_stderr\": 0.00887033125649\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T04_59_45.337848
path:
- '**/details_harness|drop|3_2023-10-28T04-59-45.337848.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T04-59-45.337848.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T04_59_45.337848
path:
- '**/details_harness|gsm8k|5_2023-10-28T04-59-45.337848.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T04-59-45.337848.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T04_59_45.337848
path:
- '**/details_harness|winogrande|5_2023-10-28T04-59-45.337848.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T04-59-45.337848.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- results_2023-10-01T14-35-45.846264.parquet
- split: 2023_10_28T04_59_45.337848
path:
- results_2023-10-28T04-59-45.337848.parquet
- split: latest
path:
- results_2023-10-28T04-59-45.337848.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down",
"harness_winogrande_5",
split="train")
```
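Each configuration also exposes the timestamped splits listed in the metadata above, plus a `latest` alias pointing at the most recent run; for example, with this card's `harness_gsm8k_5` configuration:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down",
    "harness_gsm8k_5",
    split="latest",
)
```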
## Latest results
These are the [latest results from run 2023-10-28T04:59:45.337848](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down/blob/main/results_2023-10-28T04-59-45.337848.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.21780620805369127,
"em_stderr": 0.004226997205914972,
"f1": 0.26442428691275155,
"f1_stderr": 0.004202345101166552,
"acc": 0.4478647632531162,
"acc_stderr": 0.010273220250493906
},
"harness|drop|3": {
"em": 0.21780620805369127,
"em_stderr": 0.004226997205914972,
"f1": 0.26442428691275155,
"f1_stderr": 0.004202345101166552
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.00887033125649
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zolak/twitter_dataset_81_1713149981 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 298612
num_examples: 781
download_size: 155512
dataset_size: 298612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MoCe/optimized-sd-config | ---
license: cc-by-nc-sa-4.0
---
|
aimagic/big5essay | ---
license: mit
---
|
james-burton/data_scientist_salary_all_text | ---
dataset_info:
features:
- name: experience
dtype: string
- name: job_description
dtype: string
- name: job_desig
dtype: string
- name: job_type
dtype: string
- name: key_skills
dtype: string
- name: location
dtype: string
- name: salary
dtype: int64
splits:
- name: train
num_bytes: 3256981
num_examples: 13464
- name: validation
num_bytes: 581453
num_examples: 2377
- name: test
num_bytes: 956394
num_examples: 3961
download_size: 2195043
dataset_size: 4794828
---
# Dataset Card for "data_scientist_salary_all_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rbeauchamp/augmented_images_40_tokens | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 221430807.0
num_examples: 222
download_size: 220307366
dataset_size: 221430807.0
---
# Dataset Card for "augmented_images_40_tokens"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1 | ---
pretty_name: Evaluation run of quantumaikr/quantum-dpo-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/quantum-dpo-v0.1](https://huggingface.co/quantumaikr/quantum-dpo-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T08:25:35.133410](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1/blob/main/results_2023-12-18T08-25-35.133410.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6583441513997987,\n\
\ \"acc_stderr\": 0.032026908469625553,\n \"acc_norm\": 0.6580775139828253,\n\
\ \"acc_norm_stderr\": 0.0326896820470086,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6991731296636601,\n\
\ \"mc2_stderr\": 0.014996767395410802\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.01343890918477876,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.713802031467835,\n\
\ \"acc_stderr\": 0.004510593395289895,\n \"acc_norm\": 0.8836885082652858,\n\
\ \"acc_norm_stderr\": 0.003199428675985866\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n\
\ \"acc_stderr\": 0.016709709877661995,\n \"acc_norm\": 0.48044692737430167,\n\
\ \"acc_norm_stderr\": 0.016709709877661995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6991731296636601,\n\
\ \"mc2_stderr\": 0.014996767395410802\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420867\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/quantum-dpo-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-25-35.133410.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-25-35.133410.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- '**/details_harness|winogrande|5_2023-12-18T08-25-35.133410.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T08-25-35.133410.parquet'
- config_name: results
data_files:
- split: 2023_12_18T08_25_35.133410
path:
- results_2023-12-18T08-25-35.133410.parquet
- split: latest
path:
- results_2023-12-18T08-25-35.133410.parquet
---
# Dataset Card for Evaluation run of quantumaikr/quantum-dpo-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [quantumaikr/quantum-dpo-v0.1](https://huggingface.co/quantumaikr/quantum-dpo-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1",
"harness_winogrande_5",
split="train")
```
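Each per-task configuration also exposes its timestamped split alongside the `latest` alias, and the aggregated metrics live in the `results` configuration. A minimal sketch (config and split names are taken verbatim from this card's `configs` list):
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" aliases the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1",
    "results",
    split="latest",
)

# A specific run can also be addressed by its timestamped split directly.
winogrande = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1",
    "harness_winogrande_5",
    split="2023_12_18T08_25_35.133410",
)
```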
## Latest results
These are the [latest results from run 2023-12-18T08:25:35.133410](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1/blob/main/results_2023-12-18T08-25-35.133410.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6583441513997987,
"acc_stderr": 0.032026908469625553,
"acc_norm": 0.6580775139828253,
"acc_norm_stderr": 0.0326896820470086,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6991731296636601,
"mc2_stderr": 0.014996767395410802
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.01343890918477876,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.713802031467835,
"acc_stderr": 0.004510593395289895,
"acc_norm": 0.8836885082652858,
"acc_norm_stderr": 0.003199428675985866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877661995,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877661995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6991731296636601,
"mc2_stderr": 0.014996767395410802
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918753
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420867
}
}
```
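The raw results file linked above can also be fetched directly; a small sketch using `huggingface_hub`, assuming the JSON contains the dictionary shown above (the on-disk file may nest it under extra keys):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_quantumaikr__quantum-dpo-v0.1",
    filename="results_2023-12-18T08-25-35.133410.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# The metrics may sit at the top level (as printed above) or under a "results" key.
metrics = payload.get("results", payload)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```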
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pandyaved98/projectdata | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713199066 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 12047
num_examples: 33
download_size: 14657
dataset_size: 12047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713199066"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713047141 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2372417
num_examples: 7393
download_size: 1335943
dataset_size: 2372417
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xxl_mode_T_CM_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 141841
num_examples: 1000
download_size: 53617
dataset_size: 141841
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xxl_mode_T_CM_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GabrielVidal/dead-by-daylight-perks | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: name
dtype: string
- name: type
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 22392351.0
num_examples: 219
download_size: 22365600
dataset_size: 22392351.0
annotations_creators:
- found
language:
- en
language_creators:
- found
multilinguality:
- monolingual
pretty_name: Dead by daylight video game perks
size_categories:
- n<1K
source_datasets:
- original
tags:
- dead by daylight
task_categories:
- image-classification
- text-to-image
task_ids:
- multi-class-image-classification
---
# Dataset Card for Dead by Daylight perks
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Contributions](#contributions)
### Dataset Summary
This dataset contains all images (on a black background and upscaled to 512x512) of perks from the video game [Dead by Daylight](https://deadbydaylight.com/), with their type, name, and description (the first sentence) in English.
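A minimal loading sketch, assuming a standard `datasets` installation (the field names follow the schema above):
```python
from datasets import load_dataset

ds = load_dataset("GabrielVidal/dead-by-daylight-perks", split="train")

perk = ds[0]
print(perk["name"], "-", perk["type"])
print(perk["description"])
perk["image"].save("perk.png")  # the `image` column decodes to a PIL image
```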
## Dataset Creation
### Source Data
All images and text have been found online, mainly on the [Dead by Daylight wiki](https://deadbydaylight.fandom.com/wiki/Dead_by_Daylight_Wiki).
## Additional Information
### Licensing Information
All images belong to [Dead by Daylight](https://deadbydaylight.com/).
### Contributions
Thanks to [@GabrielVidal1](https://github.com/GabrielVidal1) for adding this dataset. |
commaai/commabody | ---
license: mit
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/608c79739c5a8f8ddd85c409/5OV6RfqSKycPUuI0rm34m.png" width="500">
# comma body
A dataset of indoor navigation with the comma body.
# description
This dataset consists of 69 segments (1-minute chunks) of video data compressed using H.265, along with sensor measurements and logs from openpilot.
# how to use
- videos: use `openpilot.tools.lib.framereader` or your favorite video decoder
- logs: use `openpilot.tools.lib.logreader` or `PlotJuggler`
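A minimal sketch of reading one segment, assuming the openpilot repository is importable and that `SEGMENT_DIR` (a hypothetical local path) holds one downloaded segment; the file names `fcamera.hevc` and `rlog.bz2` are assumptions here, and the reader APIs can differ across openpilot versions:
```python
from openpilot.tools.lib.framereader import FrameReader
from openpilot.tools.lib.logreader import LogReader

SEGMENT_DIR = "path/to/segment"  # hypothetical: one downloaded 1-min segment

# Decode the first H.265 video frame to an RGB array (file name is an assumption).
fr = FrameReader(f"{SEGMENT_DIR}/fcamera.hevc")
frame = fr.get(0, pix_fmt="rgb24")[0]
print(frame.shape)

# Iterate the capnp log events and print each event's type.
for msg in LogReader(f"{SEGMENT_DIR}/rlog.bz2"):
    print(msg.which())
    break
```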
# have fun!
|
arbml/masader | ---
license: mit
---
|
CyberHarem/dorothy_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dorothy/ドロシー/多萝西 (Arknights)
This is the dataset of dorothy/ドロシー/多萝西 (Arknights), containing 281 images and their tags.
The core tags of this character are `animal_ears, long_hair, blonde_hair, mouse_ears, yellow_eyes, breasts, mouse_girl, hat, white_headwear, hair_between_eyes, medium_breasts, very_long_hair, large_breasts, low_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 281 | 600.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 281 | 478.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 747 | 962.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dorothy_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, black_dress, black_gloves, cleavage, pearl_necklace, solo, official_alternate_costume, veil, looking_at_viewer, juliet_sleeves, parted_lips, cowboy_shot, elbow_gloves |
| 1 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, veil, black_dress, black_gloves, pearl_necklace, white_background, cleavage, elbow_gloves, hands_up, simple_background, upper_body, parted_lips |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_skirt, cleavage, cowboy_shot, mismatched_gloves, open_clothes, solo, white_coat, yellow_gloves, looking_at_viewer, single_glove, thigh_strap, white_shirt, off_shoulder, single_fingerless_glove, tail, jacket, long_sleeves, simple_background, white_background |
| 3 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, off_shoulder, white_coat, yellow_gloves, simple_background, white_shirt, cleavage_cutout, cowboy_shot, long_sleeves, smile, white_background, black_skirt, closed_mouth, open_coat, asymmetrical_gloves, holding |
| 4 | 10 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, cleavage, closed_mouth, simple_background, smile, upper_body, white_background, yellow_gloves, blush, off_shoulder, shirt |
| 5 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, closed_mouth, looking_at_viewer, off_shoulder, solo, upper_body, simple_background, smile, white_background, white_jacket, yellow_gloves, long_sleeves, open_clothes, white_shirt, beret, hand_up |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, solo_focus, blush, completely_nude, cowgirl_position, crossed_bangs, girl_on_top, looking_at_viewer, navel, nipples, pov, simple_background, sweat, :d, collarbone, heart-shaped_pupils, looking_back, mouse_tail, sex_from_behind, speech_bubble, spoken_heart, spread_legs, tongue_out, vaginal, white_background |
| 7 | 6 |  |  |  |  |  | 1girl, hetero, nipples, solo_focus, 1boy, blush, mouse_tail, open_mouth, black_thighhighs, cum, heart_censor, penis, simple_background, tongue_out, ahegao, breasts_out, gradient_background, lactation, nude, pussy, saliva, sex |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_gloves | cleavage | pearl_necklace | solo | official_alternate_costume | veil | looking_at_viewer | juliet_sleeves | parted_lips | cowboy_shot | elbow_gloves | white_background | hands_up | simple_background | upper_body | bare_shoulders | black_skirt | mismatched_gloves | open_clothes | white_coat | yellow_gloves | single_glove | thigh_strap | white_shirt | off_shoulder | single_fingerless_glove | tail | jacket | long_sleeves | cleavage_cutout | smile | closed_mouth | open_coat | asymmetrical_gloves | holding | blush | shirt | white_jacket | beret | hand_up | 1boy | hetero | open_mouth | solo_focus | completely_nude | cowgirl_position | crossed_bangs | girl_on_top | navel | nipples | pov | sweat | :d | collarbone | heart-shaped_pupils | looking_back | mouse_tail | sex_from_behind | speech_bubble | spoken_heart | spread_legs | tongue_out | vaginal | black_thighhighs | cum | heart_censor | penis | ahegao | breasts_out | gradient_background | lactation | nude | pussy | saliva | sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:-----------|:-----------------|:-------|:-----------------------------|:-------|:--------------------|:-----------------|:--------------|:--------------|:---------------|:-------------------|:-----------|:--------------------|:-------------|:-----------------|:--------------|:--------------------|:---------------|:-------------|:----------------|:---------------|:--------------|:--------------|:---------------|:--------------------------|:-------|:---------|:---------------|:------------------|:--------|:---------------|:------------|:----------------------|:----------|:--------|:--------|:---------------|:--------|:----------|:-------|:---------|:-------------|:-------------|:------------------|:-------------------|:----------------|:--------------|:--------|:----------|:------|:--------|:-----|:-------------|:----------------------|:---------------|:-------------|:------------------|:----------------|:---------------|:--------------|:-------------|:----------|:-------------------|:------|:---------------|:--------|:---------|:--------------|:----------------------|:------------|:-------|:--------|:---------|:------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | X | | X | | | X | | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | | | | X | | | X | | | X | | X | | X | | X | X | | | X | X | | | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | X | | X | | | X | | | | | X | | X | X | X | | | | | X | | | | X | | | | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | X | | | X | | | | | X | | X | X | X | | | X | | X | | | X | X | | | | X | | X | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | | | | | | X | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
hynky/czech_news_dataset_v2 | ---
dataset_info:
features:
- name: url
dtype: string
- name: authors
sequence: string
- name: headline
dtype: string
- name: brief
dtype: string
- name: keywords
sequence: string
- name: category
dtype:
class_label:
names:
'0': None
'1': Zahraniční
'2': Domácí
'3': Sport
'4': Kultura
'5': Revue
'6': Koktejl
'7': Ekonomika
'8': Krimi
'9': Podnikání
'10': Auto
'11': Věda
'12': Komentáře
'13': Cestování
'14': Finance
'15': Technologie
'16': Bydlení
'17': Koronavirus
'18': Byznys
'19': Rozhovory
'20': Podcasty
'21': Životní styl
'22': Literatura
'23': Vánoce
'24': Výtvarné umění
'25': Kolo
- name: content
dtype: large_string
- name: comments_num
dtype: int32
- name: server
dtype:
class_label:
names:
'0': None
'1': seznamzpravy
'2': idnes
'3': aktualne
'4': novinky
'5': denik
'6': irozhlas
- name: category_unclean
dtype: string
- name: authors_gender
sequence:
class_label:
names:
'0': None
'1': MAN
'2': WOMAN
- name: authors_cum_gender
dtype:
class_label:
names:
'0': None
'1': MAN
'2': WOMAN
'3': MIXED
- name: day_of_week
dtype:
class_label:
names:
'0': None
'1': MONDAY
'2': TUESDAY
'3': WEDNESDAY
'4': THURSDAY
'5': FRIDAY
'6': SATURDAY
'7': SUNDAY
- name: date
dtype: timestamp[us]
splits:
- name: train
num_bytes: 5140455719
num_examples: 1641471
- name: validation
num_bytes: 529659351
num_examples: 144836
- name: test
num_bytes: 553381092
num_examples: 144837
download_size: 4133635457
dataset_size: 6223496162
license: cc0-1.0
task_categories:
- text-classification
- summarization
language:
- cs
tags:
- news
pretty_name: CZE-NEC
size_categories:
- 1M<n<10M
---
# Dataset Card for "czech_news_dataset_v2"
- Dataset containing news articles from major Czech online news outlets, collected from 2000-2022.
- Follow-up paper: https://arxiv.org/abs/2307.10666 (covers v1 of the dataset)
- Changes from v1:
  - Better coverage of novinky.cz in later stages
  - More articles, as a mistake in filtering was fixed.
- Collection was done using CmonCrawl.
- The dataset should be used for research purposes only, as I do not hold the rights to the articles themselves.
- If you have any questions, contact me at kydlicek.hynek@gmail.com
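A minimal loading sketch (the `category` column is a `ClassLabel`, so integer ids map back to the Czech category names; note the full download is roughly 4 GB):
```python
from datasets import load_dataset

ds = load_dataset("hynky/czech_news_dataset_v2", split="train")

example = ds[0]
print(example["headline"])
# int2str maps the ClassLabel id back to its name (e.g. "Sport")
print(ds.features["category"].int2str(example["category"]))
```
|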
GEM-submissions/lewtun__this-is-a-test-name__1655913671 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
mtkinit/AI4Copernicus-SmallSentimentAnalysis | ---
pretty_name: AI4Copernicus-SmallSentimentAnalysis
---
# AI4Copernicus-SmallSentimentAnalysis
Created from AIOD platform |
dmrau/cqadubstack-wordpress-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 19885
num_examples: 744
download_size: 11490
dataset_size: 19885
---
# Dataset Card for "cqadubstack-wordpress-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
avsolatorio/mteb-banking77-avs_triplets | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
- name: idx
dtype: int64
- name: query_idx
dtype: int64
- name: positive_idx
dtype: int64
- name: negative_idx
dtype: int64
splits:
- name: train
num_bytes: 1286431
num_examples: 10003
download_size: 612460
dataset_size: 1286431
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MTEB Banking77 Triplets Dataset
This dataset was used in the paper GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning. Refer to https://arxiv.org/abs/2402.16829 for details.
The code for generating the data is available at https://github.com/avsolatorio/GISTEmbed/blob/main/scripts/create_classification_dataset.py.
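As a minimal sketch, the triplet indices can be dereferenced back to text through the `idx` column, assuming (as the schema suggests) that `query_idx`, `positive_idx`, and `negative_idx` reference `idx` values within the same split:
```python
from datasets import load_dataset

ds = load_dataset("avsolatorio/mteb-banking77-avs_triplets", split="train")

# Build a lookup from idx to text so triplet indices can be resolved.
idx_to_text = {row["idx"]: row["text"] for row in ds}

row = ds[0]
print("query:   ", idx_to_text[row["query_idx"]])
print("positive:", idx_to_text[row["positive_idx"]])
print("negative:", idx_to_text[row["negative_idx"]])
```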
## Citation
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
URL={https://arxiv.org/abs/2402.16829},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
Sree1994/ddb_baseprompts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Base_prompt
dtype: string
- name: Prompt
dtype: string
splits:
- name: train
num_bytes: 14886028
num_examples: 51602
- name: test
num_bytes: 2096918
num_examples: 7299
- name: valid
num_bytes: 4301342
num_examples: 14817
download_size: 10829614
dataset_size: 21284288
---
# Dataset Card for "ddb_baseprompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aggshourya/anime_demo_testing2 | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 263289.0
num_examples: 8
download_size: 263173
dataset_size: 263289.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nhantruongcse/130_test_vit5_base | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 97793645
num_examples: 12990
download_size: 43120373
dataset_size: 97793645
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
K-Monitor/kmdb_base | ---
language:
- hu
size_categories:
- 10K<n<100K
pretty_name: K-Monitor sajtóadatbázis
dataset_info:
features:
- name: news_id
dtype: int64
- name: title
dtype: string
- name: source_url
dtype: string
- name: kmdb_url
dtype: string
- name: newspaper
dtype: string
- name: pub_time
dtype: string
- name: authors
sequence: string
- name: text
dtype: string
- name: meta_keywords
sequence: string
- name: meta_description
dtype: string
- name: url
dtype: string
- name: kmonitor_title
dtype: string
- name: publish_date
dtype: string
- name: kmonitor_description
dtype: string
- name: kmonitor_places
sequence: string
- name: kmonitor_persons
sequence: string
- name: kmonitor_others
sequence: string
- name: kmonitor_institutions
sequence: string
splits:
- name: train
num_bytes: 243123503.81640086
num_examples: 46914
download_size: 140573780
dataset_size: 243123503.81640086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Source: https://adatbazis.k-monitor.hu/
## Usage
```python
from datasets import load_dataset
dataset = load_dataset('K-Monitor/kmdb_base')
# print the title of one article
print(dataset['train'][0]['title'])
```
## Column descriptions
`news_id`: the article's K-Monitor identifier
`title`: the title given by the news portal (may be incomplete, see `kmonitor_title`)
`source_url`: the article's URL, pointing to the news portal
`kmdb_url`: points to the article's record in the K-Monitor database (https://adatbazis.k-monitor.hu/hirek/...)
`newspaper`: name of the news portal (taken from the K-Monitor database)
`pub_time`: time of publication (taken from the K-Monitor database)
`authors`: the article's authors, extracted while parsing the article (may be wrong, often missing)
`text`: the body of the article, cleaned (never empty, but may be incomplete)
`meta_keywords`: news-portal tags, also extracted while parsing the article (not always present and not always meaningful)
`meta_description`: the lead given by the news portal (a short intro/summary at the top)
`url`: the article's URL, normalized (http->https, www stripped, other URL parameters removed [except for a few news portals])
`kmonitor_title`: the title recorded in the K-Monitor database
`publish_date`: the article's publication date, of varying precision and rarely nonsensical
`kmonitor_description`: the description stored in the K-Monitor database (often `null`, sometimes identical to the regular description)
`kmonitor_places`: locations labeled by K-Monitor volunteers
`kmonitor_persons`: persons labeled by K-Monitor volunteers
`kmonitor_others`: other keywords labeled by K-Monitor volunteers (the given word does not necessarily appear in the article)
`kmonitor_institutions`: institutions labeled by K-Monitor volunteers |
Back-up/hate_speech | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: results
struct:
- name: dangerous_content
dtype: string
- name: harassment
dtype: string
- name: hate_speech
dtype: string
- name: normal
dtype: string
- name: sexaully_explixit
dtype: string
- name: prediction
sequence: string
splits:
- name: train
num_bytes: 86995.3784
num_examples: 298
download_size: 63062
dataset_size: 86995.3784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl | ---
pretty_name: Evaluation run of deepseek-ai/deepseek-math-7b-rl
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepseek-ai/deepseek-math-7b-rl](https://huggingface.co/deepseek-ai/deepseek-math-7b-rl)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T20:36:43.895843](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl/blob/main/results_2024-04-15T20-36-43.895843.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5602939981344383,\n\
\ \"acc_stderr\": 0.03441897837176519,\n \"acc_norm\": 0.5705247302143901,\n\
\ \"acc_norm_stderr\": 0.03524623166939266,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4036948798131458,\n\
\ \"mc2_stderr\": 0.015288577859532484\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298964,\n\
\ \"acc_norm\": 0.5546075085324232,\n \"acc_norm_stderr\": 0.014523987638344086\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5488946425014938,\n\
\ \"acc_stderr\": 0.0049658660983181715,\n \"acc_norm\": 0.724457279426409,\n\
\ \"acc_norm_stderr\": 0.004458742356237882\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.582010582010582,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\"\
: 0.582010582010582,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6387096774193548,\n \"acc_stderr\": 0.02732754844795754,\n \"\
acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.02732754844795754\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4222222222222222,\n \"acc_stderr\": 0.030114442019668092,\n \
\ \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.030114442019668092\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.01877605231961963,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.01877605231961963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n \"\
acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990414,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990414\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.016740929047162692,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.016740929047162692\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.02677299065336182,\n\
\ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.02677299065336182\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n\
\ \"acc_stderr\": 0.01568044151888918,\n \"acc_norm\": 0.32625698324022345,\n\
\ \"acc_norm_stderr\": 0.01568044151888918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.027767689606833932,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.027767689606833932\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39048239895697523,\n\
\ \"acc_stderr\": 0.012460135913945078,\n \"acc_norm\": 0.39048239895697523,\n\
\ \"acc_norm_stderr\": 0.012460135913945078\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714867,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714867\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954773,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.038057975055904594,\n\
\ \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.038057975055904594\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4036948798131458,\n\
\ \"mc2_stderr\": 0.015288577859532484\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6677190213101816,\n \"acc_stderr\": 0.013238316554236528\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.006216328640238133\n }\n}\n```"
repo_url: https://huggingface.co/deepseek-ai/deepseek-math-7b-rl
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-36-43.895843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-36-43.895843.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- '**/details_harness|winogrande|5_2024-04-15T20-36-43.895843.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T20-36-43.895843.parquet'
- config_name: results
data_files:
- split: 2024_04_15T20_36_43.895843
path:
- results_2024-04-15T20-36-43.895843.parquet
- split: latest
path:
- results_2024-04-15T20-36-43.895843.parquet
---
# Dataset Card for Evaluation run of deepseek-ai/deepseek-math-7b-rl
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-math-7b-rl](https://huggingface.co/deepseek-ai/deepseek-math-7b-rl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl",
"harness_winogrande_5",
split="train")
```
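The same call works for any of the configurations listed in the YAML header above. As a minimal sketch (the configuration and split names are taken from that header), the aggregated results and the latest run of a single subtask can be loaded like this:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl"

# Aggregated metrics for the run (the "results" configuration above)
results = load_dataset(repo, "results", split="latest")

# Per-sample details for one MMLU subtask, pinned to the latest run
details = load_dataset(repo, "harness_hendrycksTest_college_chemistry_5", split="latest")
```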
## Latest results
These are the [latest results from run 2024-04-15T20:36:43.895843](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-rl/blob/main/results_2024-04-15T20-36-43.895843.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5602939981344383,
"acc_stderr": 0.03441897837176519,
"acc_norm": 0.5705247302143901,
"acc_norm_stderr": 0.03524623166939266,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4036948798131458,
"mc2_stderr": 0.015288577859532484
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.5546075085324232,
"acc_norm_stderr": 0.014523987638344086
},
"harness|hellaswag|10": {
"acc": 0.5488946425014938,
"acc_stderr": 0.0049658660983181715,
"acc_norm": 0.724457279426409,
"acc_norm_stderr": 0.004458742356237882
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739428,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739428
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.582010582010582,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.582010582010582,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.030114442019668092,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.030114442019668092
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.01877605231961963,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.01877605231961963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990414,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990414
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162692,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162692
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.01568044151888918,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.01568044151888918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.027767689606833932,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.027767689606833932
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39048239895697523,
"acc_stderr": 0.012460135913945078,
"acc_norm": 0.39048239895697523,
"acc_norm_stderr": 0.012460135913945078
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714867,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714867
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4036948798131458,
"mc2_stderr": 0.015288577859532484
},
"harness|winogrande|5": {
"acc": 0.6677190213101816,
"acc_stderr": 0.013238316554236528
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.006216328640238133
}
}
```
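For quick inspection, the per-task entries above can be flattened into a table. The following is a minimal sketch, assuming the dict printed above has been bound to a variable named `results` (a hypothetical name):

```python
import pandas as pd

# `results` is assumed to hold the dict shown above
rows = [{"task": name, **metrics} for name, metrics in results.items() if name != "all"]
df = pd.DataFrame(rows).set_index("task")

# Tasks where accuracy is reported, sorted from strongest to weakest
print(df[["acc", "acc_stderr"]].dropna().sort_values("acc", ascending=False))
```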
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BangumiBase/koisuruasteroid | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Koisuru Asteroid
This is the image base of bangumi Koisuru Asteroid. We detected 31 characters and 2450 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
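As a sketch of that workflow (assuming the `huggingface_hub` client; the repository id and archive name are taken from the links on this page), the archive can be fetched and unpacked before manual cleaning:

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download the full archive linked above; per-character archives
# (e.g. "0/dataset.zip") can be fetched the same way.
archive = hf_hub_download(
    repo_id="BangumiBase/koisuruasteroid",
    filename="all.zip",
    repo_type="dataset",
)

with zipfile.ZipFile(archive) as zf:
    zf.extractall("koisuru_asteroid")  # inspect and drop noisy samples by hand
```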
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 501 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 15 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 22 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 40 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 222 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 45 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 14 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 94 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 425 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 14 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 18 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 26 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 17 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 114 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 27 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 15 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 13 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 39 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 245 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 27 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 39 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 187 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 9 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 15 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 12 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 33 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 12 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 71 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 5 | [Download](29/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 128 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
huggingface/autotrain-data-tchhh2 | Invalid username or password. |
open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b | ---
pretty_name: Evaluation run of Delcos/Mistral-Pygmalion-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:05:25.339926](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-28T13-05-25.339926.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460774,\n \"f1\": 0.05936241610738259,\n\
\ \"f1_stderr\": 0.0013656193493625718,\n \"acc\": 0.41059662883495607,\n\
\ \"acc_stderr\": 0.009533380943461503\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460774,\n\
\ \"f1\": 0.05936241610738259,\n \"f1_stderr\": 0.0013656193493625718\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855575\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Delcos/Mistral-Pygmalion-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|drop|3_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-05-25.339926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-05-25.339926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|winogrande|5_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-05-25.339926.parquet'
- config_name: results
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- results_2023-10-10T20-14-17.715432.parquet
- split: 2023_10_28T13_05_25.339926
path:
- results_2023-10-28T13-05-25.339926.parquet
- split: latest
path:
- results_2023-10-28T13-05-25.339926.parquet
---
# Dataset Card for Evaluation run of Delcos/Mistral-Pygmalion-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Mistral-Pygmalion-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
"harness_winogrande_5",
split="train")
```
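The aggregated metrics live under the `results` configuration in the same way; as a minimal sketch following the pattern above, loading its `latest` split returns the most recent aggregated results:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
    "results",
    split="latest",
)
```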
## Latest results
These are the [latest results from run 2023-10-28T13:05:25.339926](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-28T13-05-25.339926.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460774,
"f1": 0.05936241610738259,
"f1_stderr": 0.0013656193493625718,
"acc": 0.41059662883495607,
"acc_stderr": 0.009533380943461503
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460774,
"f1": 0.05936241610738259,
"f1_stderr": 0.0013656193493625718
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855575
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arthurmluz/GPTextSum_data-wiki_cstnews_1024_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 30882
num_examples: 20
download_size: 39589
dataset_size: 30882
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-wiki_cstnews_1024_results"
rouge= {'rouge1': 0.4308100446730652, 'rouge2': 0.21805946047970845, 'rougeL': 0.32857244937346486, 'rougeLsum': 0.32857244937346486}
bert= {'precision': 0.7545565485954284, 'recall': 0.7717844933271408, 'f1': 0.762201464176178} |
bigscience-data/roots_zh-cn_wikipedia | ---
language: zh
language_bcp47:
- zh-CN
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh-cn_wikipedia
# wikipedia
- Dataset uid: `wikipedia`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 3.2299 % of total
- 4.2071 % of en
- 5.6773 % of ar
- 3.3416 % of fr
- 5.2815 % of es
- 12.4852 % of ca
- 0.4288 % of zh
- 0.4286 % of zh
- 5.4743 % of indic-bn
- 8.9062 % of indic-ta
- 21.3313 % of indic-te
- 4.4845 % of pt
- 4.0493 % of indic-hi
- 11.3163 % of indic-ml
- 22.5300 % of indic-ur
- 4.4902 % of vi
- 16.9916 % of indic-kn
- 24.7820 % of eu
- 11.6241 % of indic-mr
- 9.8749 % of id
- 9.3489 % of indic-pa
- 9.4767 % of indic-gu
- 24.1132 % of indic-as
- 5.3309 % of indic-or
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ca
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: zh
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-or
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
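The filter names above describe the processing steps but not their implementation. As a purely illustrative sketch (hypothetical code; the actual BigScience preprocessing may differ), the three recurring filters could look roughly like this:
```python
def dedup_document(docs):
    # Hypothetical sketch: keep only the first occurrence of each exact text.
    seen, kept = set(), []
    for doc in docs:
        if doc["text"] not in seen:
            seen.add(doc["text"])
            kept.append(doc)
    return kept

def filter_remove_empty_docs(docs):
    # Hypothetical sketch: drop documents whose text is empty or whitespace-only.
    return [doc for doc in docs if doc["text"].strip()]

def filter_small_docs_bytes_300(docs):
    # Hypothetical sketch: drop documents whose UTF-8 text is shorter than
    # 300 bytes (filter_small_docs_bytes_1024 would use a 1024-byte threshold).
    return [doc for doc in docs if len(doc["text"].encode("utf-8")) >= 300]
```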
|
Broomva/instruct-spa-guc | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29316120
num_examples: 128889
download_size: 13852155
dataset_size: 29316120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hahad14/Parkinsons_Disease_Speech | ---
license: mit
---
|
argilla/mistral-vs-llama | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for mistral-vs-llama
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/mistral-vs-llama")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/mistral-vs-llama")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| input | Input | TextField | True | False |
| response_1 | Response_1 | TextField | True | True |
| response_2 | Response_2 | TextField | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| ranking | Rank the responses | RankingQuestion | True | N/A | ['response_1', 'response_2'] |
| rationale_1 | Rationale behind response_1's ranking? | TextQuestion | True | N/A | N/A |
| rationale_2 | Rationale behind response_2's ranking? | TextQuestion | True | N/A | N/A |
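For orientation, a dataset with these fields and questions could be declared with the Argilla Python client roughly as follows (a minimal sketch assuming the Argilla 1.x `FeedbackDataset` API; the dataset's actual configuration ships in `argilla.yaml`):
```python
import argilla as rg

# Sketch of the fields and questions listed in the tables above.
dataset = rg.FeedbackDataset(
    fields=[
        rg.TextField(name="input"),
        rg.TextField(name="response_1", use_markdown=True),
        rg.TextField(name="response_2", use_markdown=True),
    ],
    questions=[
        rg.RankingQuestion(
            name="ranking",
            title="Rank the responses",
            values=["response_1", "response_2"],
        ),
        rg.TextQuestion(name="rationale_1", title="Rationale behind response_1's ranking?"),
        rg.TextQuestion(name="rationale_2", title="Rationale behind response_2's ranking?"),
    ],
)
```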
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; they contain the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"input": "How do I locate my card?",
"response_1": " Thank you for reaching out! I\u0027m here to help you with your query. However, I must point out that the term \"locate\" can have different meanings depending on the context. Could you please provide more details or clarify what you mean by \"locate\"? Are you trying to find a specific card, such as a credit card or debit card? Or perhaps you\u0027re looking for something else entirely? Once I have a better understanding of your question, I\u0027ll be happy to assist you in a responsible and accurate manner.",
"response_2": " Of course! I\u0027d be happy to help you locate your missing card. To start, could you please tell me more about the card you\u0027re trying to find? Is it a credit card, debit card, or another type of card? Additionally, can you provide more details about the circumstances surrounding its disappearance? For example, did you lose it at home, in a store, or somewhere else? Any additional information you can provide will help me give you more tailored advice on how to locate your card."
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"values": {
"ranking": {
"value": [
{
"rank": 2,
"value": "response_1"
},
{
"rank": 1,
"value": "response_2"
}
]
},
"rationale_1": {
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
},
"rationale_2": {
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
}
}
],
"suggestions": [
{
"question_id": "9820a347-3573-46cf-88e3-c172e02c6f96",
"question_name": "ranking",
"value": [
{
"rank": 1,
"value": "response_2"
},
{
"rank": 2,
"value": "response_1"
}
]
},
{
"agent": "gpt-4",
"question_id": "177a00ed-8536-4cd5-a755-d4436cfd1455",
"question_name": "rationale_1",
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
},
{
"agent": "gpt-4",
"question_id": "7bd8451e-4c63-491b-afd8-8aabb67ce62f",
"question_name": "rationale_2",
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
]
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"input": "How do I locate my card?",
"metadata": "{}",
"ranking": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": {
"rank": [
2,
1
],
"value": [
"response_1",
"response_2"
]
}
}
],
"ranking-suggestion": {
"rank": [
1,
2
],
"value": [
"response_2",
"response_1"
]
},
"ranking-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"rationale_1": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
}
],
"rationale_1-suggestion": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative.",
"rationale_1-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"rationale_2": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
],
"rationale_2-suggestion": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful.",
"rationale_2-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"response_1": " Thank you for reaching out! I\u0027m here to help you with your query. However, I must point out that the term \"locate\" can have different meanings depending on the context. Could you please provide more details or clarify what you mean by \"locate\"? Are you trying to find a specific card, such as a credit card or debit card? Or perhaps you\u0027re looking for something else entirely? Once I have a better understanding of your question, I\u0027ll be happy to assist you in a responsible and accurate manner.",
"response_2": " Of course! I\u0027d be happy to help you locate your missing card. To start, could you please tell me more about the card you\u0027re trying to find? Is it a credit card, debit card, or another type of card? Additionally, can you provide more details about the circumstances surrounding its disappearance? For example, did you lose it at home, in a store, or somewhere else? Any additional information you can provide will help me give you more tailored advice on how to locate your card."
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **input** is of type `TextField`.
* **response_1** is of type `TextField`.
* **response_2** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **ranking** is of type `RankingQuestion` with the following allowed values ['response_1', 'response_2'].
* **rationale_1** is of type `TextQuestion`.
* **rationale_2** is of type `TextQuestion`.
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **ranking-suggestion** is of type `ranking` with the following allowed values ['response_1', 'response_2'].
* (optional) **rationale_1-suggestion** is of type `text`.
* (optional) **rationale_2-suggestion** is of type `text`.
Additionally, there is one more field, which is optional:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
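Following the HuggingFace-format record shown earlier, these columns can be inspected directly after loading with `datasets` (a small sketch reusing the load snippet above):
```python
from datasets import load_dataset

ds = load_dataset("argilla/mistral-vs-llama", split="train")
record = ds[0]

# Field, suggestion, and optional columns follow the naming convention above.
print(record["input"])
print(record["ranking-suggestion"])       # e.g. ranks for ["response_2", "response_1"]
print(record["rationale_1-suggestion"])
print(record["external_id"])              # optional; may be None
```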
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
notwice/test_up | ---
license: afl-3.0
---
|
arthurmluz/xlsum_data-wiki_gptextsum_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
splits:
- name: validation
num_bytes: 26768065
num_examples: 7175
download_size: 16308689
dataset_size: 26768065
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "xlsum_data-wiki_gptextsum_results"
rouge= {'rouge1': 0.25431289948140545, 'rouge2': 0.06235500652708138, 'rougeL': 0.16542968042081027, 'rougeLsum': 0.16542968042081027}
bert= {'precision': 0.6915352174463172, 'recall': 0.7106410252424898, 'f1': 0.7006534646030917} |
open-llm-leaderboard/details_Corianas__Quokka_590m | ---
pretty_name: Evaluation run of Corianas/Quokka_590m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/Quokka_590m](https://huggingface.co/Corianas/Quokka_590m) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_590m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T16:50:32.705897](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_590m/blob/main/results_2023-10-14T16-50-32.705897.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n\
\ \"em_stderr\": 0.0006192871806511148,\n \"f1\": 0.03957634228187927,\n\
\ \"f1_stderr\": 0.0012672315965293443,\n \"acc\": 0.2509865824782952,\n\
\ \"acc_stderr\": 0.0070261881296128145\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511148,\n\
\ \"f1\": 0.03957634228187927,\n \"f1_stderr\": 0.0012672315965293443\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n\
\ \"acc_stderr\": 0.014052376259225629\n }\n}\n```"
repo_url: https://huggingface.co/Corianas/Quokka_590m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|arc:challenge|25_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T16_50_32.705897
path:
- '**/details_harness|drop|3_2023-10-14T16-50-32.705897.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T16-50-32.705897.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T16_50_32.705897
path:
- '**/details_harness|gsm8k|5_2023-10-14T16-50-32.705897.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T16-50-32.705897.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hellaswag|10_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T09:57:25.772408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T09:57:25.772408.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T16_50_32.705897
path:
- '**/details_harness|winogrande|5_2023-10-14T16-50-32.705897.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T16-50-32.705897.parquet'
- config_name: results
data_files:
- split: 2023_07_24T09_57_25.772408
path:
- results_2023-07-24T09:57:25.772408.parquet
- split: 2023_10_14T16_50_32.705897
path:
- results_2023-10-14T16-50-32.705897.parquet
- split: latest
path:
- results_2023-10-14T16-50-32.705897.parquet
---
# Dataset Card for Evaluation run of Corianas/Quokka_590m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_590m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_590m](https://huggingface.co/Corianas/Quokka_590m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_590m",
"harness_winogrande_5",
split="train")
```
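To load the details of a specific run rather than the latest one, pass the timestamp split name instead of "train"; a minimal sketch using the `2023_10_14T16_50_32.705897` split declared in the YAML metadata above:
```python
from datasets import load_dataset

# Split names use '_' in place of ':' separators, exactly as listed
# under `data_files` in the YAML metadata above.
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_590m",
	"harness_winogrande_5",
	split="2023_10_14T16_50_32.705897")
```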
## Latest results
These are the [latest results from run 2023-10-14T16:50:32.705897](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_590m/blob/main/results_2023-10-14T16-50-32.705897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511148,
"f1": 0.03957634228187927,
"f1_stderr": 0.0012672315965293443,
"acc": 0.2509865824782952,
"acc_stderr": 0.0070261881296128145
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511148,
"f1": 0.03957634228187927,
"f1_stderr": 0.0012672315965293443
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225629
}
}
```
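The aggregated figures above live in the `results` configuration; a minimal sketch for loading them (assuming the `latest` split defined in the YAML metadata above):
```python
from datasets import load_dataset

# "latest" always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_590m",
	"results",
	split="latest")
```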
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-elementary_mathematics-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6417
num_examples: 5
- name: test
num_bytes: 1326061
num_examples: 378
download_size: 179519
dataset_size: 1332478
---
# Dataset Card for "mmlu-elementary_mathematics-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zhadix/Mima | ---
language:
- ja
- en
tags:
- music
- art
pretty_name: Images&GR
size_categories:
- n<1K
--- |
egoing/dataset_repository_name | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Hamid-reza/Adv-small-persian-QA | ---
dataset_info:
features:
- name: id
dtype: int32
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 2728312
num_examples: 1261
- name: validation
num_bytes: 276911
num_examples: 130
download_size: 1500971
dataset_size: 3005223
---
# Dataset Card for "Adv-small-persian-QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thorirhrafn/domar_data_split | ---
dataset_info:
features:
- name: Title
dtype: string
- name: Keywords
dtype: string
- name: Summary
dtype: string
- name: Text
dtype: string
splits:
- name: train
num_bytes: 206427013.2
num_examples: 7461
- name: eval
num_bytes: 11454333.663691195
num_examples: 414
- name: test
num_bytes: 11482001.136308806
num_examples: 415
download_size: 112632160
dataset_size: 229363348.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
version-control/ds-lib-version-test | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: version
list:
- name: pyproject.toml
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: requirements.txt
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: setup.py
struct:
- name: matplotlib
dtype: string
- name: numpy
dtype: string
- name: pandas
dtype: string
- name: scikit-learn
dtype: string
- name: scipy
dtype: string
- name: tensorflow
dtype: string
- name: torch
dtype: string
- name: hexsha
sequence: string
splits:
- name: train
num_bytes: 2524
num_examples: 10
download_size: 13601
dataset_size: 2524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_13 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10584481
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_no_id_v5_full_recite_ans_sent_random_permute_rerun_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10270378.612347052
num_examples: 6305
- name: validation
num_bytes: 402971
num_examples: 300
download_size: 1637368
dataset_size: 10673349.612347052
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_ans_sent_random_permute_rerun_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/Dialogsum-german | ---
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 12886236
num_examples: 12460
- name: test
num_bytes: 1539886
num_examples: 1500
- name: validation
num_bytes: 504815
num_examples: 500
download_size: 7946878
dataset_size: 14930937
---
# Dataset Card for "Dialogsum-german"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelginn/bert_data_tokenized_grouped | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 19019398080.0
num_examples: 6167120
download_size: 6396096898
dataset_size: 19019398080.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wilsonslz/GOLDENINFANTIL | ---
license: openrail
---
|
ftuncc/doctorchat | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4294895
num_examples: 3325
download_size: 2390573
dataset_size: 4294895
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|