| datasetId | card |
|---|---|
rjaiswal/bulgari | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 165217688.0
num_examples: 233
download_size: 163391080
dataset_size: 165217688.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1 | ---
pretty_name: Evaluation run of mrfakename/NeuralOrca-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mrfakename/NeuralOrca-7B-v1](https://huggingface.co/mrfakename/NeuralOrca-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T17:53:31.960115](https://huggingface.co/datasets/open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1/blob/main/results_2023-12-04T17-53-31.960115.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6385330990221446,\n\
\ \"acc_stderr\": 0.032248165389573695,\n \"acc_norm\": 0.6406523603337572,\n\
\ \"acc_norm_stderr\": 0.032892154968215216,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5457774305208005,\n\
\ \"mc2_stderr\": 0.015413416681633433\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349814,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.01391303452962045\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n\
\ \"acc_stderr\": 0.004714386376337136,\n \"acc_norm\": 0.8507269468233419,\n\
\ \"acc_norm_stderr\": 0.0035562912320503525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666789,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666789\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n\
\ \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n \"\
acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697389,\n \"mc2\": 0.5457774305208005,\n\
\ \"mc2_stderr\": 0.015413416681633433\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249784\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5845337376800607,\n \
\ \"acc_stderr\": 0.013574222625031811\n }\n}\n```"
repo_url: https://huggingface.co/mrfakename/NeuralOrca-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-53-31.960115.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- '**/details_harness|winogrande|5_2023-12-04T17-53-31.960115.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T17-53-31.960115.parquet'
- config_name: results
data_files:
- split: 2023_12_04T17_53_31.960115
path:
- results_2023-12-04T17-53-31.960115.parquet
- split: latest
path:
- results_2023-12-04T17-53-31.960115.parquet
---
# Dataset Card for Evaluation run of mrfakename/NeuralOrca-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mrfakename/NeuralOrca-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mrfakename/NeuralOrca-7B-v1](https://huggingface.co/mrfakename/NeuralOrca-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1",
"harness_winogrande_5",
split="train")
```
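Since each run's split is named with its timestamp, the newest run can be picked programmatically by sorting the split names — timestamps in this format sort lexicographically in chronological order. A minimal sketch (the split list below is illustrative, hand-copied from this card):

```python
# Illustrative split names in the timestamp format used by this dataset
splits = ["2023_12_04T17_53_31.960115", "latest"]

# Ignore the "latest" alias and take the lexicographic max of the
# timestamped names, which is the most recent run.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # -> 2023_12_04T17_53_31.960115
```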
## Latest results
These are the [latest results from run 2023-12-04T17:53:31.960115](https://huggingface.co/datasets/open-llm-leaderboard/details_mrfakename__NeuralOrca-7B-v1/blob/main/results_2023-12-04T17-53-31.960115.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6385330990221446,
"acc_stderr": 0.032248165389573695,
"acc_norm": 0.6406523603337572,
"acc_norm_stderr": 0.032892154968215216,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5457774305208005,
"mc2_stderr": 0.015413416681633433
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349814,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.01391303452962045
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337136,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.0035562912320503525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666789,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666789
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697389,
"mc2": 0.5457774305208005,
"mc2_stderr": 0.015413416681633433
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249784
},
"harness|gsm8k|5": {
"acc": 0.5845337376800607,
"acc_stderr": 0.013574222625031811
}
}
```
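The results JSON above can also be consumed programmatically, e.g. to pull the aggregate score and collect the per-task accuracies. A minimal sketch, using a hand-copied subset of the metrics above:

```python
import json

# A small excerpt of the results JSON shown above
results_json = """
{
  "all": {"acc": 0.6385330990221446, "acc_norm": 0.6406523603337572},
  "harness|winogrande|5": {"acc": 0.7876874506708761},
  "harness|gsm8k|5": {"acc": 0.5845337376800607}
}
"""
results = json.loads(results_json)

# Overall accuracy lives under the "all" key; everything else is per-task.
overall_acc = results["all"]["acc"]
per_task = {task: scores["acc"] for task, scores in results.items() if task != "all"}
print(f"overall acc: {overall_acc:.4f}")  # overall acc: 0.6385
```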
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ruanchaves/reli-sa_por_Latn_to_spa_Latn | ---
dataset_info:
features:
- name: source
dtype: string
- name: title
dtype: string
- name: book
dtype: string
- name: review_id
dtype: string
- name: score
dtype: float64
- name: sentence_id
dtype: int64
- name: unique_review_id
dtype: string
- name: sentence
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1833644
num_examples: 7875
- name: validation
num_bytes: 323687
num_examples: 1348
- name: test
num_bytes: 673218
num_examples: 3288
download_size: 0
dataset_size: 2830549
---
# Dataset Card for "reli-sa_por_Latn_to_spa_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arbml/Sudanese_Dialect_Tweet_Tele | ---
dataset_info:
features:
- name: Tweet ID
dtype: string
- name: Tweet Text
dtype: string
- name: Date
dtype: string
- name: label
dtype:
class_label:
names:
0: NEGATIVE
1: POSITIVE
2: OBJECTIVE
splits:
- name: train
num_bytes: 872272
num_examples: 5346
download_size: 353611
dataset_size: 872272
---
# Dataset Card for "Sudanese_Dialect_Tweet_Tele"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
12345testing/echo_testing | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 524579.0
num_examples: 8
download_size: 525593
dataset_size: 524579.0
---
# Dataset Card for "echo_testing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T05:52:04.564811](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6/blob/main/results_2023-08-30T05%3A52%3A04.564811.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5732546102102893,\n\
\ \"acc_stderr\": 0.034192375404008664,\n \"acc_norm\": 0.5769517967834359,\n\
\ \"acc_norm_stderr\": 0.034176064530211395,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.01658630490176256,\n \"mc2\": 0.5264024071528917,\n\
\ \"mc2_stderr\": 0.016382172245984476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076136\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6101374228241386,\n\
\ \"acc_stderr\": 0.004867221634461273,\n \"acc_norm\": 0.8095000995817566,\n\
\ \"acc_norm_stderr\": 0.003918928556590479\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042767,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042767\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.03228410626716391,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.03228410626716391\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803067,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803067\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240658,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240658\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722926,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905718,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905718\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291488,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291488\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.012682016335646673,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.012682016335646673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03025437257397671,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03025437257397671\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5816993464052288,\n \"acc_stderr\": 0.01995597514583555,\n \
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.01995597514583555\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.01658630490176256,\n \"mc2\": 0.5264024071528917,\n\
\ \"mc2_stderr\": 0.016382172245984476\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|arc:challenge|25_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hellaswag|10_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T05:52:04.564811.parquet'
- config_name: results
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- results_2023-08-30T05:52:04.564811.parquet
- split: latest
path:
- results_2023-08-30T05:52:04.564811.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v6
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T05:52:04.564811](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6/blob/main/results_2023-08-30T05%3A52%3A04.564811.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5732546102102893,
"acc_stderr": 0.034192375404008664,
"acc_norm": 0.5769517967834359,
"acc_norm_stderr": 0.034176064530211395,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.01658630490176256,
"mc2": 0.5264024071528917,
"mc2_stderr": 0.016382172245984476
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076136
},
"harness|hellaswag|10": {
"acc": 0.6101374228241386,
"acc_stderr": 0.004867221634461273,
"acc_norm": 0.8095000995817566,
"acc_norm_stderr": 0.003918928556590479
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042767,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.03228410626716391,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.03228410626716391
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803067,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803067
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240658,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240658
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722926,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905718,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905718
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291488,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291488
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646673,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03025437257397671,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03025437257397671
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.01995597514583555,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.01995597514583555
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.01658630490176256,
"mc2": 0.5264024071528917,
"mc2_stderr": 0.016382172245984476
}
}
```
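As an illustration, the aggregated `"all"` block is essentially a macro-average over the per-task scores. A minimal sketch over a small, illustrative subset of the `hendrycksTest` accuracies shown above (not the full task set, so the result differs from the reported `all.acc`):

```python
# Macro-average accuracy over a few of the per-task results listed above.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.31,
    "hendrycksTest-anatomy": 0.5259259259259259,
    "hendrycksTest-astronomy": 0.5855263157894737,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))  # 0.4738
```

The same pattern, applied over every task in the `results` configuration, reproduces the headline numbers used by the leaderboard.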
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cahya/fleurs | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- expert-generated
language:
- afr
- amh
- ara
- asm
- ast
- azj
- bel
- ben
- bos
- cat
- ceb
- cmn
- ces
- cym
- dan
- deu
- ell
- eng
- spa
- est
- fas
- ful
- fin
- tgl
- fra
- gle
- glg
- guj
- hau
- heb
- hin
- hrv
- hun
- hye
- ind
- ibo
- isl
- ita
- jpn
- jav
- kat
- kam
- kea
- kaz
- khm
- kan
- kor
- ckb
- kir
- ltz
- lug
- lin
- lao
- lit
- luo
- lav
- mri
- mkd
- mal
- mon
- mar
- msa
- mlt
- mya
- nob
- npi
- nld
- nso
- nya
- oci
- orm
- ory
- pan
- pol
- pus
- por
- ron
- rus
- bul
- snd
- slk
- slv
- sna
- som
- srp
- swe
- swh
- tam
- tel
- tgk
- tha
- tur
- ukr
- umb
- urd
- uzb
- vie
- wol
- xho
- yor
- yue
- zul
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: FLEURS
tags:
- speech-recognition
---
# FLEURS
## Dataset Description
- **Fine-Tuning script:** [pytorch/speech-recognition](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition)
- **Paper:** [FLEURS: Few-shot Learning Evaluation of
Universal Representations of Speech](https://arxiv.org/abs/2205.12446)
- **Total amount of disk used:** ca. 350 GB
Fleurs is the speech version of the [FLoRes machine translation benchmark](https://arxiv.org/abs/2106.03193).
We use 2009 n-way parallel sentences from the FLoRes dev and devtest publicly available sets, in 102 languages.
Training sets have around 10 hours of supervision. Speakers of the train sets are different from speakers of the dev/test sets. Multilingual fine-tuning is
used, and the "unit error rate" (characters, signs) of all languages is averaged. Languages and results are also grouped into seven geographical areas:
- **Western Europe**: *Asturian, Bosnian, Catalan, Croatian, Danish, Dutch, English, Finnish, French, Galician, German, Greek, Hungarian, Icelandic, Irish, Italian, Kabuverdianu, Luxembourgish, Maltese, Norwegian, Occitan, Portuguese, Spanish, Swedish, Welsh*
- **Eastern Europe**: *Armenian, Belarusian, Bulgarian, Czech, Estonian, Georgian, Latvian, Lithuanian, Macedonian, Polish, Romanian, Russian, Serbian, Slovak, Slovenian, Ukrainian*
- **Central-Asia/Middle-East/North-Africa**: *Arabic, Azerbaijani, Hebrew, Kazakh, Kyrgyz, Mongolian, Pashto, Persian, Sorani-Kurdish, Tajik, Turkish, Uzbek*
- **Sub-Saharan Africa**: *Afrikaans, Amharic, Fula, Ganda, Hausa, Igbo, Kamba, Lingala, Luo, Northern-Sotho, Nyanja, Oromo, Shona, Somali, Swahili, Umbundu, Wolof, Xhosa, Yoruba, Zulu*
- **South-Asia**: *Assamese, Bengali, Gujarati, Hindi, Kannada, Malayalam, Marathi, Nepali, Oriya, Punjabi, Sindhi, Tamil, Telugu, Urdu*
- **South-East Asia**: *Burmese, Cebuano, Filipino, Indonesian, Javanese, Khmer, Lao, Malay, Maori, Thai, Vietnamese*
- **CJK languages**: *Cantonese and Mandarin Chinese, Japanese, Korean*
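The averaged "unit error rate" mentioned above is, for most languages, a character-level error rate: the edit distance between reference and hypothesis, normalized by the reference length. A minimal, illustrative sketch (not the paper's exact implementation):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two character sequences (iterative DP)."""
    d = list(range(len(hyp) + 1))  # distances from the empty ref prefix
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i  # prev holds the diagonal cell from the last row
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, or (possibly free) substitution
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def cer(ref, hyp):
    """Character error rate: edit operations normalized by reference length."""
    return edit_distance(ref, hyp) / len(ref)

print(cer("kitten", "sitting"))  # 0.5  (3 edits / 6 reference characters)
```

Averaging this rate per language, then over the languages of a group, yields the grouped results reported for each geographical area.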
## Supported Tasks
### 1. Speech Recognition (ASR)
```py
from datasets import load_dataset
fleurs_asr = load_dataset("google/fleurs", "af_za") # for Afrikaans
# to download all data for multi-lingual fine-tuning, uncomment the following line
# fleurs_asr = load_dataset("google/fleurs", "all")
# see structure
print(fleurs_asr)
# load audio sample on the fly
audio_input = fleurs_asr["train"][0]["audio"] # first decoded audio sample
transcription = fleurs_asr["train"][0]["transcription"] # first transcription
# use `audio_input` and `transcription` to fine-tune your model for ASR
# for analyses see language groups
all_language_groups = fleurs_asr["train"].features["lang_group_id"].names
lang_group_id = fleurs_asr["train"][0]["lang_group_id"]
all_language_groups[lang_group_id]
```
### 2. Language Identification
LangID can often be a domain classification, but in the case of FLEURS-LangID, recordings are made in a similar setting across languages and the utterances correspond to n-way parallel sentences, in the exact same domain, which makes this task particularly relevant for evaluating LangID. The setting is simple: FLEURS-LangID is split into train/valid/test for each language. We simply create a single train/valid/test for LangID by merging them all.
```py
from datasets import load_dataset
fleurs_langID = load_dataset("google/fleurs", "all") # to download all data
# see structure
print(fleurs_langID)
# load audio sample on the fly
audio_input = fleurs_langID["train"][0]["audio"] # first decoded audio sample
language_class = fleurs_langID["train"][0]["lang_id"] # first id class
language = fleurs_langID["train"].features["lang_id"].names[language_class]
# use audio_input and language_class to fine-tune your model for audio classification
```
### 3. Retrieval
Retrieval provides n-way parallel speech and text data. Similar to how XTREME for text leverages Tatoeba to evaluate bitext mining, a.k.a. sentence translation retrieval, we use Retrieval to evaluate the quality of fixed-size representations of speech utterances. Our goal is to incentivize the creation of fixed-size speech encoders for speech retrieval. The system has to retrieve the English "key" utterance corresponding to the speech translation of "queries" in 15 languages. Results have to be reported on the test sets of Retrieval, whose utterances are used as queries (and keys for English). We augment the English keys with a large number of utterances to make the task more difficult.
```py
from datasets import load_dataset
fleurs_retrieval = load_dataset("google/fleurs", "af_za") # for Afrikaans
# to download all data for multi-lingual fine-tuning, uncomment the following line
# fleurs_retrieval = load_dataset("google/fleurs", "all")
# see structure
print(fleurs_retrieval)
# load audio sample on the fly
audio_input = fleurs_retrieval["train"][0]["audio"] # decoded audio sample
text_sample_pos = fleurs_retrieval["train"][0]["transcription"] # positive text sample
text_sample_neg = fleurs_retrieval["train"][1:20]["transcription"] # negative text samples
# use `audio_input`, `text_sample_pos`, and `text_sample_neg` to fine-tune your model for retrieval
```
Users can leverage the training (and dev) sets of FLEURS-Retrieval with a ranking loss to build better cross-lingual fixed-size representations of speech.
## Dataset Structure
We show detailed information for the example configuration `af_za` of the dataset.
All other configurations have the same structure.
### Data Instances
**af_za**
- Size of downloaded dataset files: 1.47 GB
- Size of the generated dataset: 1 MB
- Total amount of disk used: 1.47 GB
An example of a data instance of the config `af_za` looks as follows:
```
{'id': 91,
'num_samples': 385920,
'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/310a663d52322700b3d3473cbc5af429bd92a23f9bc683594e70bc31232db39e/home/vaxelrod/FLEURS/oss2_obfuscated/af_za/audio/train/17797742076841560615.wav',
'audio': {'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/310a663d52322700b3d3473cbc5af429bd92a23f9bc683594e70bc31232db39e/home/vaxelrod/FLEURS/oss2_obfuscated/af_za/audio/train/17797742076841560615.wav',
'array': array([ 0.0000000e+00, 0.0000000e+00, 0.0000000e+00, ...,
-1.1205673e-04, -8.4638596e-05, -1.2731552e-04], dtype=float32),
'sampling_rate': 16000},
'raw_transcription': 'Dit is nog nie huidiglik bekend watter aantygings gemaak sal word of wat owerhede na die seun gelei het nie maar jeugmisdaad-verrigtinge het in die federale hof begin',
'transcription': 'dit is nog nie huidiglik bekend watter aantygings gemaak sal word of wat owerhede na die seun gelei het nie maar jeugmisdaad-verrigtinge het in die federale hof begin',
'gender': 0,
'lang_id': 0,
'language': 'Afrikaans',
'lang_group_id': 3}
```
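As a quick sanity check on such an instance, the clip duration follows directly from `num_samples` and the audio `sampling_rate`. For the example above:

```python
# Duration of the example clip: number of float samples divided by sample rate.
num_samples = 385920      # from the instance above
sampling_rate = 16_000    # FLEURS audio is 16 kHz

duration_seconds = num_samples / sampling_rate
print(duration_seconds)  # 24.12
```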
### Data Fields
The data fields are the same among all splits.
- **id** (int): ID of audio sample
- **num_samples** (int): Number of float values
- **path** (str): Path to the audio file
- **audio** (dict): Audio object including the loaded audio array, sampling rate and path to the audio file
- **raw_transcription** (str): The non-normalized transcription of the audio file
- **transcription** (str): Transcription of the audio file
- **gender** (int): Class id of gender
- **lang_id** (int): Class id of language
- **lang_group_id** (int): Class id of language group
### Data Splits
Every config has a `"train"` split containing *ca.* 1000 examples, and `"validation"` and `"test"` splits each containing *ca.* 400 examples.
## Dataset Creation
We collect between one and three recordings for each sentence (2.3 on average), and build new train-dev-test splits with 1509, 150 and 350 sentences for
train, dev and test respectively.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is meant to encourage the development of speech technology in a lot more languages of the world. One of the goals is to give equal access to technologies like speech recognition or speech translation to everyone, meaning better dubbing or better access to content from the internet (like podcasts, streaming or videos).
### Discussion of Biases
Most datasets have a fair distribution of gender utterances (e.g. the newly introduced FLEURS dataset). While many languages are covered from various regions of the world, the benchmark misses many languages that are all equally important. We believe technology built through FLEURS should generalize to all languages.
### Other Known Limitations
The dataset has a particular focus on read-speech because common evaluation benchmarks like CoVoST-2 or LibriSpeech evaluate on this type of speech. There is sometimes a known mismatch between performance obtained in a read-speech setting and a more noisy setting (in production for instance). Given the big progress that remains to be made on many languages, we believe better performance on FLEURS should still correlate well with actual progress made for speech understanding.
## Additional Information
All datasets are licensed under the [Creative Commons license (CC-BY)](https://creativecommons.org/licenses/).
### Citation Information
You can access the FLEURS paper at https://arxiv.org/abs/2205.12446.
Please cite the paper when referencing the FLEURS corpus as:
```
@article{fleurs2022arxiv,
title = {FLEURS: Few-shot Learning Evaluation of Universal Representations of Speech},
author = {Conneau, Alexis and Ma, Min and Khanuja, Simran and Zhang, Yu and Axelrod, Vera and Dalmia, Siddharth and Riesa, Jason and Rivera, Clara and Bapna, Ankur},
journal={arXiv preprint arXiv:2205.12446},
url = {https://arxiv.org/abs/2205.12446},
  year = {2022},
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) and [@aconneau](https://github.com/aconneau) for adding this dataset.
|
tigerbhai/tigerbhai | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aisyahhrazak/crawl-fliphtml | ---
language:
- ms
- en
---
Fliphtml pdf text version
Search Query:
- Melayu |
HoangHa/alpaca_vi | ---
license: apache-2.0
---
|
bigscience-data/roots_indic-hi_indic_nlp_corpus | ---
language: hi
license: cc-by-nc-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-hi_indic_nlp_corpus
# Indic NLP Corpus
- Dataset uid: `indic_nlp_corpus`
### Description
The IndicNLP corpus is a large-scale, general-domain corpus containing 2.7 billion words for 10 Indian languages from two language families (Indo-Aryan and Dravidian). Each language has at least 100 million words (except Oriya).
### Homepage
https://github.com/AI4Bharat/indicnlp_corpus#publicly-available-classification-datasets
### Licensing
- non-commercial use
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
### Speaker Locations
- Southern Asia
- India
### Sizes
- 3.4019 % of total
- 44.4368 % of indic-hi
- 64.2943 % of indic-ta
- 70.5374 % of indic-ml
- 54.2394 % of indic-te
- 55.9105 % of indic-kn
- 61.6111 % of indic-mr
- 67.2242 % of indic-pa
- 68.1470 % of indic-or
- 64.3879 % of indic-gu
- 4.1495 % of indic-bn
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
mask-distilled-one-sec-cv12/chunk_94 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1235253004
num_examples: 242587
download_size: 1261067755
dataset_size: 1235253004
---
# Dataset Card for "chunk_94"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/snips_chatgpt_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1034367
num_examples: 13084
download_size: 414459
dataset_size: 1034367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
csupiisc/plmn1.5l | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 754298
num_examples: 10000
download_size: 299510
dataset_size: 754298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "plmn1.5l"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LukeEuser/docvqa_50_50_test_unanswerable_questions | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: test
num_bytes: 35194134.0
num_examples: 100
download_size: 11860250
dataset_size: 35194134.0
---
# Dataset Card for "docvqa_50_50_test_unanswerable_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713014351 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11368
num_examples: 29
download_size: 8719
dataset_size: 11368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713014351"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kevinsguo/test998 | ---
license: apache-2.0
tags:
- helloword
--- |
divi7007/diviO | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 20361096
num_examples: 497
download_size: 6945517
dataset_size: 20361096
---
# Dataset Card for "diviO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AItestaccount/LLMPrompts | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 7352
num_examples: 10
download_size: 9937
dataset_size: 7352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
astarostap/autonlp-data-antisemitism-2 | ---
language:
- en
task_categories:
- text-classification
---
# AutoNLP Dataset for project: antisemitism-2
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project antisemitism-2.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": 0,
"text": "Jew pods"
},
{
"target": 1,
"text": "@PotatoLaydee He's a Jew...."
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "ClassLabel(num_classes=2, names=['0', '1'], names_file=None, id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3161 |
| valid | 791 |
|
ihanif/markhor-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1008453.0
num_examples: 15
download_size: 1005068
dataset_size: 1008453.0
---
# Dataset Card for "markhor-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/wikiclir_pt | ---
pretty_name: '`wikiclir/pt`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/pt`
The `wikiclir/pt` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/pt).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=973,057
- `queries` (i.e., topics); count=611,732
- `qrels`: (relevance assessments); count=1,741,889
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_pt', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_pt', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_pt', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
VibhuRaj01/Bone_Tumor | ---
task_categories:
- object-detection
tags:
- biology
- medical
size_categories:
- 1K<n<10K
---
---
Description:
- This dataset comprises images of bone cancer annotated with bounding boxes for object detection tasks. It is a combination of two distinct datasets: one sourced from Roboflow, featuring images of tumor-affected bones, and another obtained from the FracAtlas dataset, containing images of healthy bones.
---
Task:
- Object Detection
- Classification
---
Annotations:
- Bounding Boxes
---
Data Source:
- Roboflow Dataset: Contains images of bones affected by tumors, sourced from Roboflow.
- FracAtlas Dataset: Comprises images of healthy bones, extracted from the FracAtlas dataset.
--- |
fiveflow/instruction_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 28123766
num_examples: 44905
download_size: 15302646
dataset_size: 28123766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruction_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v4_test_720000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 6734938.5
num_examples: 18000
- name: test
num_bytes: 748326.5
num_examples: 2000
download_size: 3226399
dataset_size: 7483265.0
---
# Dataset Card for "final_train_v4_test_720000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tomekkorbak/pile-pii-scrubadub | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: pile-pii-scrubadub
size_categories:
- 1M<n<10M
source_datasets:
- extended|the_pile
tags:
- pii
- personal
- identifiable
- information
- pretraining-with-human-feedback
task_categories:
- text-classification
- other
task_ids:
- acceptability-classification
- text-scoring
---
# Dataset Card for pile-pii-scrubadub
## Dataset Description
- **Repository: https://github.com/tomekkorbak/aligned-pretraining-objectives**
- **Paper: Arxiv link to be added**
### Dataset Summary
This dataset contains text from [The Pile](https://huggingface.co/datasets/the_pile), annotated based on the personally identifiable information (PII) in each sentence.
Each document (row in the dataset) is segmented into sentences, and each sentence is given a score: the percentage of words in it that are classified as PII by [Scrubadub](https://scrubadub.readthedocs.io/en/stable/).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
This dataset is taken from [The Pile](https://huggingface.co/datasets/the_pile), which is English text.
## Dataset Structure
### Data Instances
The dataset contains 1,949,977 instances.
### Data Fields
- texts (sequence): a list of the sentences in the document (segmented using [SpaCy](https://spacy.io/))
- meta (dict): the section of [The Pile](https://huggingface.co/datasets/the_pile) from which it originated
- scores (sequence): a score for each sentence in the `texts` column indicating the percent of words that are detected as PII by [Scrubadub](https://scrubadub.readthedocs.io/en/stable/)
- avg_score (float64): the average of the scores listed in the `scores` column
- num_sents (int64): the number of sentences (and scores) in that document
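The per-document aggregates can be reproduced from the sentence-level scores; a minimal sketch on synthetic records (not loaded from the Hub), assuming one wants to keep only low-PII documents:

```python
# Synthetic stand-ins for dataset rows: each document has per-sentence
# PII scores; avg_score and num_sents are derived exactly as described above.
docs = [
    {"texts": ["Call me at 555-0100.", "Nice weather today."], "scores": [0.25, 0.0]},
    {"texts": ["Nothing personal here."], "scores": [0.0]},
]
for doc in docs:
    doc["num_sents"] = len(doc["scores"])
    doc["avg_score"] = sum(doc["scores"]) / len(doc["scores"])

# Keep only documents whose average PII rate is below a chosen threshold.
low_pii = [doc for doc in docs if doc["avg_score"] < 0.1]
```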
### Data Splits
Training set only
## Dataset Creation
### Curation Rationale
This is labeled text from [The Pile](https://huggingface.co/datasets/the_pile), a large dataset of text in English. The PII is labeled so that generative language models can be trained to avoid generating PII.
### Source Data
#### Initial Data Collection and Normalization
This is labeled text from [The Pile](https://huggingface.co/datasets/the_pile).
#### Who are the source language producers?
Please see [The Pile](https://huggingface.co/datasets/the_pile) for the source of the dataset.
### Annotations
#### Annotation process
For each sentence, [Scrubadub](https://scrubadub.readthedocs.io/en/stable/) was used to detect:
- email addresses
- addresses and postal codes
- phone numbers
- credit card numbers
- US social security numbers
- vehicle plates numbers
- dates of birth
- URLs
- login credentials
#### Who are the annotators?
[Scrubadub](https://scrubadub.readthedocs.io/en/stable/)
### Personal and Sensitive Information
This dataset contains all PII that was originally contained in [The Pile](https://huggingface.co/datasets/the_pile), with all detected PII annotated.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contains examples of real PII (conveniently annotated in the text!). Please take care to avoid misusing it or putting anybody in danger by publicizing their information.
This dataset is intended for research purposes only. We cannot guarantee that all PII has been detected, and we cannot guarantee that models trained using it will avoid generating PII.
We do not recommend deploying models trained on this data.
### Discussion of Biases
This dataset contains all biases from The Pile discussed in their paper: https://arxiv.org/abs/2101.00027
### Other Known Limitations
The PII in this dataset was detected using imperfect automated detection methods. We cannot guarantee that the labels are 100% accurate.
## Additional Information
### Dataset Curators
[The Pile](https://huggingface.co/datasets/the_pile)
### Licensing Information
From [The Pile](https://huggingface.co/datasets/the_pile): PubMed Central: [MIT License](https://github.com/EleutherAI/pile-pubmedcentral/blob/master/LICENSE)
### Citation Information
Paper information to be added
### Contributions
[The Pile](https://huggingface.co/datasets/the_pile) |
anonymizedauthor/paper_data | ---
license: cc-by-nc-sa-4.0
---
Linguistic features for 5 datasets.
UD relations: https://universaldependencies.org/ru/
Method for detecting reaction on frustration: http://dx.doi.org/10.1007/978-3-030-86855-0_2
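Several of the surface features below are simple ratios over tokens and characters; a minimal sketch of how a few of them could be computed (an illustrative helper, not the authors' code):

```python
import string

def surface_features(text: str) -> dict:
    """Compute a few of the surface-level ratios listed below."""
    words = text.split()
    chars = [c for c in text if not c.isspace()]
    punct = [c for c in text if c in string.punctuation]
    return {
        "punctuation_per_word": len(punct) / len(words),
        "uppercase_rate": sum(c.isupper() for c in chars) / len(chars),
        "mean_word_len": sum(len(w.strip(string.punctuation)) for w in words) / len(words),
    }

feats = surface_features("Hello world! This is a TEST.")
```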
Feature description:
feature --> description
punctuation_per_word --> Number of punctuation / Number of words
uppercase_rate --> Number of uppercase chars / Number of chars
mean_word_len --> Mean word length in chars
mean_sentence_len --> Mean sentence length in words
unique_words_rate --> Number of unique words / Number of words
verbs_1p_rate --> Number of first person verbs / Number of verbs
verbs_2p_rate --> Number of second person verbs / Number of verbs
verbs_3p_rate --> Number of third person verbs / Number of verbs
verbs_past_tense_rate --> Number of past tense verbs / Number of verbs
infinitives_rate --> Number of infinitive verbs / Number of verbs
pro_1p_rate --> Number of first person pronouns / Number of pronouns
pro_1p_sing_rate --> Number of first person singular pronouns / Number of pronouns
pro_1p_plural_rate --> Number of first person plural pronouns / Number of pronouns
pro_2p_rate --> Number of second person pronouns / Number of pronouns
pro_3p_rate --> Number of third person pronouns / Number of pronouns
trager_coef --> Number of verbs / Number of adjectives
logical_coh_coef --> (Number of conjunctions + Number of particles) / number of sentences * 3
verbs_per_nouns_coef --> Number of verbs / Number of nouns
participles_gerunds_coef --> Number of participles / Number of verbs
negation_rate --> Number of negative prefixes / Number of words
postag_A --> Number of A postags / Number of words
postag_ADV --> Number of ADV postags / Number of words
postag_ADVPRO --> Number of ADVPRO postags / Number of words
postag_ANUM --> Number of ANUM postags / Number of words
postag_APRO --> Number of APRO postags / Number of words
postag_COM --> Number of COM postags / Number of words
postag_CONJ --> Number of CONJ postags / Number of words
postag_INTJ --> Number of INTJ postags / Number of words
postag_NUM --> Number of NUM postags / Number of words
postag_PART --> Number of PART postags / Number of words
postag_PR --> Number of PR postags / Number of words
postag_S --> Number of S postags / Number of words
postag_SPRO --> Number of SPRO postags / Number of words
postag_V --> Number of V postags / Number of words
tgw_positive_assessment --> Dictionary: words related to positive assessment
tgw_positive_social --> Dictionary: words related to positive sociality
tgw_positive_emotions --> Dictionary: words related to positive emotions
tgw_negative_assessment --> Dictionary: words related to negative assessment
tgw_negative_social --> Dictionary: words related to negative sociality
tgw_negative_emotions --> Dictionary: words related to negative emotions
tgw_motivation_activity --> Dictionary: words related to motivation, activity and tension
tgw_cognitive_communication --> Dictionary: words related to cognitive activity and communication
tgw_destructive_activity --> Dictionary: words related to destructive activity
tgw_affect_lex --> Dictionary: affectogenic language
tgw_bodily_states_emotions --> Dictionary: words related to negative and passive emotions and bodily states
tgw_invectives --> Dictionary: invectives
tgw_soft_invectives --> Dictionary: soft invectives
tgw_obscene_lex --> Dictionary: obscene lexicon
tgw_youth_jargon --> Dictionary: youth jargon
tgw_hcs --> Dictionary: words related to housing and communal services
tgw_economics --> Dictionary: words related to economics
tgw_catastrophes --> Dictionary: words related to catastrophes
tgw_security_structures --> Dictionary: words related to security structures
tgw_healthcare_demography_ecology --> Dictionary: words related to healthcare, demography and ecology
tgw_authority --> Dictionary: words related to authority
be_disgust --> Dictionary: basic emotions of disgust
be_shame --> Dictionary: basic emotions of shame
be_anger --> Dictionary: basic emotions of anger
be_fear --> Dictionary: basic emotions of fear
be_sadness --> Dictionary: basic emotions of sadness
be_calm_excitement --> Dictionary: basic emotions of calm and excitement
be_happyness --> Dictionary: basic emotions of happiness
be_wonder --> Dictionary: basic emotions of wonder
ew_positive --> Dictionary: positive emotives
ew_negative --> Dictionary: negative emotives
ew_ambivalent --> Dictionary: ambivalent emotives
ew_de_emotives --> Dictionary: deemotives
sentiment_rate --> Sentiment score based on linis-crowd dictionary
max_synt_tree --> Max syntax tree length
min_synt_tree --> Min syntax tree length
mean_synt_tree --> Mean syntax tree length
flat:foreign: --> Number of UD relations normalized by Number of words
csubj --> Number of UD relations normalized by Number of words
acl --> Number of UD relations normalized by Number of words
acl:relcl --> Number of UD relations normalized by Number of words
advcl --> Number of UD relations normalized by Number of words
advmod --> Number of UD relations normalized by Number of words
amod --> Number of UD relations normalized by Number of words
appos --> Number of UD relations normalized by Number of words
aux --> Number of UD relations normalized by Number of words
aux:pass --> Number of UD relations normalized by Number of words
case --> Number of UD relations normalized by Number of words
cc --> Number of UD relations normalized by Number of words
cc:preconj --> Number of UD relations normalized by Number of words
ccomp --> Number of UD relations normalized by Number of words
conj --> Number of UD relations normalized by Number of words
cop --> Number of UD relations normalized by Number of words
det --> Number of UD relations normalized by Number of words
discourse --> Number of UD relations normalized by Number of words
fixed --> Number of UD relations normalized by Number of words
flat --> Number of UD relations normalized by Number of words
goeswith --> Number of UD relations normalized by Number of words
iobj --> Number of UD relations normalized by Number of words
list --> Number of UD relations normalized by Number of words
mark --> Number of UD relations normalized by Number of words
nmod --> Number of UD relations normalized by Number of words
nsubj --> Number of UD relations normalized by Number of words
nsubj:pass --> Number of UD relations normalized by Number of words
nummod --> Number of UD relations normalized by Number of words
nummod:gov --> Number of UD relations normalized by Number of words
obj --> Number of UD relations normalized by Number of words
obl --> Number of UD relations normalized by Number of words
orphan --> Number of UD relations normalized by Number of words
parataxis --> Number of UD relations normalized by Number of words
punct --> Number of UD relations normalized by Number of words
root --> Number of UD relations normalized by Number of words
xcomp --> Number of UD relations normalized by Number of words
compound --> Number of UD relations normalized by Number of words
flat:foreign --> Number of UD relations normalized by Number of words
E_group --> Reaction on frustration: E type
M_group --> Reaction on frustration: M type
I_group --> Reaction on frustration: I type
inf_group --> Reaction on frustration: no reaction |
kaleemWaheed/twitter_dataset_1713066739 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11591
num_examples: 25
download_size: 8672
dataset_size: 11591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
teragron/wikitr | ---
license: mit
language:
- tr
pretty_name: wtr
size_categories:
- 1K<n<10K
--- |
Sharathhebbar24/awesome_chatgpt_prompts_kannada | ---
dataset_info:
features:
- name: act
dtype: string
- name: prompt
dtype: string
- name: kannada_prompt
dtype: string
splits:
- name: train
num_bytes: 282163
num_examples: 153
download_size: 122602
dataset_size: 282163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- translation
- question-answering
- text-generation
language:
- en
- kn
tags:
- kannada
size_categories:
- n<1K
---
Kannada translation of [fka/awesome-chatgpt-prompts](https://huggingface.co/datasets/fka/awesome-chatgpt-prompts) |
kgr123/quality_mcqa_3 | ---
dataset_info:
features:
- name: document_id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: context_orig
dtype: string
- name: token_soft_limit_deberta
dtype: int64
- name: len_soft_limit
dtype: int64
- name: context
dtype: string
- name: questions
dtype: string
- name: insertion_labels
dtype: string
- name: query
dtype: string
- name: option_0
dtype: string
- name: option_1
dtype: string
- name: option_2
dtype: string
- name: option_3
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 245043661.60689655
num_examples: 1732
- name: validation
num_bytes: 52765838.61891892
num_examples: 367
- name: test
num_bytes: 52704971.17297297
num_examples: 367
download_size: 145211266
dataset_size: 350514471.39878845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
TwoAbove/midjourney-messages | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: channel_id
dtype: string
- name: content
dtype: string
- name: timestamp
dtype: string
- name: image_id
dtype: string
- name: image
dtype: image
- name: url
dtype: string
- name: height
dtype: int64
- name: width
dtype: int64
- name: size
dtype: int64
splits:
- name: train
num_bytes: 0
num_examples: 0
configs:
- config_name: default
data_files:
- split: train
path: data/*
---
# midjourney-messages
## Description
This dataset contains the raw messages from Midjourney.
Initial dataset is https://huggingface.co/datasets/vivym/midjourney-messages, but this one has the images attached.
|
comet-team/mastodon-instances | ---
dataset_info:
features:
- name: name
dtype: string
- name: title
dtype: string
- name: short_description
dtype: string
- name: description
dtype: string
- name: uptime
dtype: float64
- name: up
dtype: bool
- name: https_score
dtype: int64
- name: https_rank
dtype: string
- name: ipv6
dtype: bool
- name: openRegistrations
dtype: bool
- name: users
dtype: int64
- name: statuses
dtype: string
- name: connections
dtype: int64
splits:
- name: train
num_bytes: 816425
num_examples: 1868
download_size: 536440
dataset_size: 816425
---
# Dataset Card for "mastodon-instances"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuwenlwl/longke | ---
license: mit
---
|
GenVRadmin/Samvaad-Mixed-Language | ---
license: mit
---
|
yzhuang/metatree_pokerhand | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: uint8
- name: y
dtype: int64
splits:
- name: train
num_bytes: 14510800
num_examples: 580432
- name: validation
num_bytes: 6219225
num_examples: 248769
download_size: 7116755
dataset_size: 20730025
---
# Dataset Card for "metatree_pokerhand"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hpprc/mqa-ja | ---
language:
- ja
license: cc0-1.0
dataset_info:
- config_name: collection
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5404867793
num_examples: 11852254
download_size: 3269616864
dataset_size: 5404867793
- config_name: dataset
features:
- name: anc
dtype: string
- name: pos_ids
sequence: int64
- name: neg_ids
sequence: 'null'
splits:
- name: train
num_bytes: 1725169456
num_examples: 5826275
download_size: 854583745
dataset_size: 1725169456
configs:
- config_name: collection
data_files:
- split: train
path: collection/train-*
- config_name: dataset
data_files:
- split: train
path: dataset/train-*
---
This dataset deduplicates the query-passage pairs from the [mqa](https://huggingface.co/datasets/clips/mqa/viewer/ja-all-question) dataset.
Preprocessing such as cleaning the noisy text in the original data and NFKC normalization has been applied.
The ids in `pos_ids` and `neg_ids` of the `dataset` subset correspond to index positions in the `collection` subset.
You can therefore retrieve the desired data by accessing it as `collection[pos_id]`.
The license follows that of the original dataset.
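The id-to-index relationship between the two subsets can be sketched with in-memory stand-ins (illustrative values, not the real data):

```python
# `collection` subset: a list of passages; `dataset` subset rows store indices
# into it, so collection[pos_id] yields the positive passage for a query.
collection = ["passage 0", "passage 1", "passage 2"]
example = {"anc": "some query", "pos_ids": [2], "neg_ids": []}
positives = [collection[i] for i in example["pos_ids"]]
```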
|
anyspeech/frame_labels | ---
dataset_info:
features:
- name: converted_phonetic_detail
struct:
- name: start
sequence: float64
- name: stop
sequence: float64
- name: utterance
sequence: string
- name: dialect_region
dtype: string
- name: file
dtype: string
- name: frame_labels
sequence: string
- name: id
dtype: string
- name: merge_phonetic_detail
struct:
- name: start
sequence: float64
- name: stop
sequence: float64
- name: utterance
sequence: string
- name: phonetic_detail
sequence:
- name: start
dtype: int64
- name: stop
dtype: int64
- name: utterance
dtype: string
- name: sentence_type
dtype: string
- name: speaker_id
dtype: string
- name: text
dtype: string
- name: word_detail
struct:
- name: start
sequence: float64
- name: stop
sequence: float64
- name: utterance
sequence: string
- name: frame_labels_10ms
sequence: string
- name: audio
struct:
- name: array
sequence: float64
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 1843096543
num_examples: 4620
- name: test
num_bytes: 673493381
num_examples: 1680
download_size: 558422047
dataset_size: 2516589924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
monicaeme/somos-alpaca-es | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 1920697
num_examples: 102
download_size: 0
dataset_size: 1920697
---
# Dataset Card for "somos-alpaca-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RealTimeData/bbc_images_alltime | ---
dataset_info:
- config_name: 2017-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 122443326.504
num_examples: 1688
download_size: 123150214
dataset_size: 122443326.504
- config_name: 2017-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 102583557.641
num_examples: 1469
download_size: 102621580
dataset_size: 102583557.641
- config_name: 2017-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 47392374.0
num_examples: 721
download_size: 0
dataset_size: 47392374.0
- config_name: 2017-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 51586742.0
num_examples: 807
download_size: 0
dataset_size: 51586742.0
- config_name: 2017-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 49729114.0
num_examples: 756
download_size: 49449289
dataset_size: 49729114.0
- config_name: 2017-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 101940444.214
num_examples: 1106
download_size: 99929261
dataset_size: 101940444.214
- config_name: 2017-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 106945858.75
num_examples: 1139
download_size: 107313303
dataset_size: 106945858.75
- config_name: 2017-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 100514218.575
num_examples: 1113
download_size: 0
dataset_size: 100514218.575
- config_name: 2017-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 111890945.259
num_examples: 1199
download_size: 109931209
dataset_size: 111890945.259
- config_name: 2017-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 110331938.63
num_examples: 1187
download_size: 107643658
dataset_size: 110331938.63
- config_name: 2017-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 126967573.77
num_examples: 1443
download_size: 125743771
dataset_size: 126967573.77
- config_name: 2017-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 115994458.002
num_examples: 1294
download_size: 114829893
dataset_size: 115994458.002
- config_name: 2018-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 118540155.49
num_examples: 1323
download_size: 117509146
dataset_size: 118540155.49
- config_name: 2018-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 117797012.007
num_examples: 1223
download_size: 111594833
dataset_size: 117797012.007
- config_name: 2018-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 109050223.68
num_examples: 1280
download_size: 108054338
dataset_size: 109050223.68
- config_name: 2018-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 127060957.288
num_examples: 1328
download_size: 0
dataset_size: 127060957.288
- config_name: 2018-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 115683290.224
num_examples: 1334
download_size: 116119560
dataset_size: 115683290.224
- config_name: 2018-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 96671553.698
num_examples: 1189
download_size: 96349655
dataset_size: 96671553.698
- config_name: 2018-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 130703350.32
num_examples: 1496
download_size: 129730979
dataset_size: 130703350.32
- config_name: 2018-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 115238413.428
num_examples: 1253
download_size: 114020376
dataset_size: 115238413.428
- config_name: 2018-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 112634923.633
num_examples: 1277
download_size: 112185186
dataset_size: 112634923.633
- config_name: 2018-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 109628565.494
num_examples: 1249
download_size: 108625160
dataset_size: 109628565.494
- config_name: 2018-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 117169704.96
num_examples: 1290
download_size: 118003238
dataset_size: 117169704.96
- config_name: 2018-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 101776799.8
num_examples: 1138
download_size: 100265438
dataset_size: 101776799.8
- config_name: 2019-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 108770178.24
num_examples: 1240
download_size: 108809412
dataset_size: 108770178.24
- config_name: 2019-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 106230713.004
num_examples: 1214
download_size: 104176974
dataset_size: 106230713.004
- config_name: 2019-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 125389025.589
num_examples: 1333
download_size: 0
dataset_size: 125389025.589
- config_name: 2019-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 120403617.8
num_examples: 1280
download_size: 0
dataset_size: 120403617.8
- config_name: 2019-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 125960887.126
num_examples: 1369
download_size: 124276084
dataset_size: 125960887.126
- config_name: 2019-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 136392438.664
num_examples: 1348
download_size: 135672371
dataset_size: 136392438.664
- config_name: 2019-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 124935466.248
num_examples: 1366
download_size: 124602238
dataset_size: 124935466.248
- config_name: 2019-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 111433073.671
num_examples: 1219
download_size: 107134716
dataset_size: 111433073.671
- config_name: 2019-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 110837648.736
num_examples: 1256
download_size: 108585292
dataset_size: 110837648.736
- config_name: 2019-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 125283502.331
num_examples: 1271
download_size: 0
dataset_size: 125283502.331
- config_name: 2019-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 114959215.775
num_examples: 1275
download_size: 109114628
dataset_size: 114959215.775
- config_name: 2019-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 111162509.392
num_examples: 1304
download_size: 110040161
dataset_size: 111162509.392
- config_name: 2020-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 128033419.15
num_examples: 1230
download_size: 126141589
dataset_size: 128033419.15
- config_name: 2020-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 115035598.574
num_examples: 1197
download_size: 115310651
dataset_size: 115035598.574
- config_name: 2020-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 106606945.448
num_examples: 1156
download_size: 0
dataset_size: 106606945.448
- config_name: 2020-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 100981933.728
num_examples: 1152
download_size: 0
dataset_size: 100981933.728
- config_name: 2020-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 110835151.6
num_examples: 1257
download_size: 108012913
dataset_size: 110835151.6
- config_name: 2020-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 108239718.802
num_examples: 1231
download_size: 106030989
dataset_size: 108239718.802
- config_name: 2020-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 126564071.054
num_examples: 1302
download_size: 121234236
dataset_size: 126564071.054
- config_name: 2020-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 108827972.6
num_examples: 1240
download_size: 106795246
dataset_size: 108827972.6
- config_name: 2020-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 137288418.524
num_examples: 1199
download_size: 127316972
dataset_size: 137288418.524
- config_name: 2020-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 141209470.562
num_examples: 1298
download_size: 0
dataset_size: 141209470.562
- config_name: 2020-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 123343537.459
num_examples: 1297
download_size: 117797111
dataset_size: 123343537.459
- config_name: 2020-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 129561685.87
num_examples: 1186
download_size: 125121403
dataset_size: 129561685.87
- config_name: 2021-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 156547452.065
num_examples: 1365
download_size: 146649533
dataset_size: 156547452.065
- config_name: 2021-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 137506074.088
num_examples: 1368
download_size: 132663295
dataset_size: 137506074.088
- config_name: 2021-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 131466965.02
num_examples: 1321
download_size: 0
dataset_size: 131466965.02
- config_name: 2021-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 140977723.04
num_examples: 1320
download_size: 0
dataset_size: 140977723.04
- config_name: 2021-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 146883988.472
num_examples: 1264
download_size: 138956660
dataset_size: 146883988.472
- config_name: 2021-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 145262842.075
num_examples: 1367
download_size: 141442160
dataset_size: 145262842.075
- config_name: 2021-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 155767742.626
num_examples: 1486
download_size: 155846129
dataset_size: 155767742.626
- config_name: 2021-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 155819324.92
num_examples: 1381
download_size: 157101336
dataset_size: 155819324.92
- config_name: 2021-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 136990935.044
num_examples: 1429
download_size: 134893440
dataset_size: 136990935.044
- config_name: 2021-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 150938276.384
num_examples: 1474
download_size: 147195524
dataset_size: 150938276.384
- config_name: 2021-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 140033960.728
num_examples: 1461
download_size: 0
dataset_size: 140033960.728
- config_name: 2021-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 147085138.232
num_examples: 1344
download_size: 139674835
dataset_size: 147085138.232
- config_name: 2022-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 157639017.352
num_examples: 1404
download_size: 150370010
dataset_size: 157639017.352
- config_name: 2022-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 160343907.725
num_examples: 1405
download_size: 160789713
dataset_size: 160343907.725
- config_name: 2022-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 173057850.04
num_examples: 1440
download_size: 0
dataset_size: 173057850.04
- config_name: 2022-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 161817777.72
num_examples: 1436
download_size: 0
dataset_size: 161817777.72
- config_name: 2022-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 152861222.783
num_examples: 1357
download_size: 144151531
dataset_size: 152861222.783
- config_name: 2022-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 155906985.904
num_examples: 1479
download_size: 152209912
dataset_size: 155906985.904
- config_name: 2022-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 174086233.52
num_examples: 1445
download_size: 174793110
dataset_size: 174086233.52
- config_name: 2022-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 144507245.265
num_examples: 1281
download_size: 141608383
dataset_size: 144507245.265
- config_name: 2022-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 220240636.862
num_examples: 1538
download_size: 220451624
dataset_size: 220240636.862
- config_name: 2022-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 157358453.01
num_examples: 1394
download_size: 149329234
dataset_size: 157358453.01
- config_name: 2022-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 163897645.91
num_examples: 1630
download_size: 161735514
dataset_size: 163897645.91
- config_name: 2022-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 174900822.864
num_examples: 1647
download_size: 165712871
dataset_size: 174900822.864
- config_name: 2023-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 188584311.627
num_examples: 1623
download_size: 186501700
dataset_size: 188584311.627
- config_name: 2023-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 184023573.872
num_examples: 1588
download_size: 175704980
dataset_size: 184023573.872
- config_name: 2023-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 190227193.1
num_examples: 1590
download_size: 0
dataset_size: 190227193.1
- config_name: 2023-04
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 180919389.272
num_examples: 1672
download_size: 0
dataset_size: 180919389.272
- config_name: 2023-05
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 209876602.552
num_examples: 1746
download_size: 220487583
dataset_size: 209876602.552
- config_name: 2023-06
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 201399691.026
num_examples: 1674
download_size: 188589435
dataset_size: 201399691.026
- config_name: 2023-07
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 201701187.76
num_examples: 1694
download_size: 185009875
dataset_size: 201701187.76
- config_name: 2023-08
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 184269420.23
num_examples: 1715
download_size: 178141669
dataset_size: 184269420.23
- config_name: 2023-09
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 161634889.2
num_examples: 1661
download_size: 162707652
dataset_size: 161634889.2
- config_name: 2023-10
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 193440351.04
num_examples: 1680
download_size: 190638289
dataset_size: 193440351.04
- config_name: 2023-11
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 167218244.95
num_examples: 1575
download_size: 158769063
dataset_size: 167218244.95
- config_name: 2023-12
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 177898899.32
num_examples: 1460
download_size: 180835697
dataset_size: 177898899.32
- config_name: 2024-01
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 190376586.82
num_examples: 1562
download_size: 174435217
dataset_size: 190376586.82
- config_name: 2024-02
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 252991022.495
num_examples: 2017
download_size: 253947493
dataset_size: 252991022.495
- config_name: 2024-03
features:
- name: url
dtype: string
- name: img
dtype: image
- name: title
dtype: string
splits:
- name: train
num_bytes: 366282766.68
num_examples: 3470
download_size: 351095375
dataset_size: 366282766.68
configs:
- config_name: 2017-01
data_files:
- split: train
path: 2017-01/train-*
- config_name: 2017-02
data_files:
- split: train
path: 2017-02/train-*
- config_name: 2017-03
data_files:
- split: train
path: 2017-03/train-*
- config_name: 2017-04
data_files:
- split: train
path: 2017-04/train-*
- config_name: 2017-05
data_files:
- split: train
path: 2017-05/train-*
- config_name: 2017-06
data_files:
- split: train
path: 2017-06/train-*
- config_name: 2017-07
data_files:
- split: train
path: 2017-07/train-*
- config_name: 2017-08
data_files:
- split: train
path: 2017-08/train-*
- config_name: 2017-09
data_files:
- split: train
path: 2017-09/train-*
- config_name: 2017-10
data_files:
- split: train
path: 2017-10/train-*
- config_name: 2017-11
data_files:
- split: train
path: 2017-11/train-*
- config_name: 2017-12
data_files:
- split: train
path: 2017-12/train-*
- config_name: 2018-01
data_files:
- split: train
path: 2018-01/train-*
- config_name: 2018-02
data_files:
- split: train
path: 2018-02/train-*
- config_name: 2018-03
data_files:
- split: train
path: 2018-03/train-*
- config_name: 2018-04
data_files:
- split: train
path: 2018-04/train-*
- config_name: 2018-05
data_files:
- split: train
path: 2018-05/train-*
- config_name: 2018-06
data_files:
- split: train
path: 2018-06/train-*
- config_name: 2018-07
data_files:
- split: train
path: 2018-07/train-*
- config_name: 2018-08
data_files:
- split: train
path: 2018-08/train-*
- config_name: 2018-09
data_files:
- split: train
path: 2018-09/train-*
- config_name: 2018-10
data_files:
- split: train
path: 2018-10/train-*
- config_name: 2018-11
data_files:
- split: train
path: 2018-11/train-*
- config_name: 2018-12
data_files:
- split: train
path: 2018-12/train-*
- config_name: 2019-01
data_files:
- split: train
path: 2019-01/train-*
- config_name: 2019-02
data_files:
- split: train
path: 2019-02/train-*
- config_name: 2019-03
data_files:
- split: train
path: 2019-03/train-*
- config_name: 2019-04
data_files:
- split: train
path: 2019-04/train-*
- config_name: 2019-05
data_files:
- split: train
path: 2019-05/train-*
- config_name: 2019-06
data_files:
- split: train
path: 2019-06/train-*
- config_name: 2019-07
data_files:
- split: train
path: 2019-07/train-*
- config_name: 2019-08
data_files:
- split: train
path: 2019-08/train-*
- config_name: 2019-09
data_files:
- split: train
path: 2019-09/train-*
- config_name: 2019-10
data_files:
- split: train
path: 2019-10/train-*
- config_name: 2019-11
data_files:
- split: train
path: 2019-11/train-*
- config_name: 2019-12
data_files:
- split: train
path: 2019-12/train-*
- config_name: 2020-01
data_files:
- split: train
path: 2020-01/train-*
- config_name: 2020-02
data_files:
- split: train
path: 2020-02/train-*
- config_name: 2020-03
data_files:
- split: train
path: 2020-03/train-*
- config_name: 2020-04
data_files:
- split: train
path: 2020-04/train-*
- config_name: 2020-05
data_files:
- split: train
path: 2020-05/train-*
- config_name: 2020-06
data_files:
- split: train
path: 2020-06/train-*
- config_name: 2020-07
data_files:
- split: train
path: 2020-07/train-*
- config_name: 2020-08
data_files:
- split: train
path: 2020-08/train-*
- config_name: 2020-09
data_files:
- split: train
path: 2020-09/train-*
- config_name: 2020-10
data_files:
- split: train
path: 2020-10/train-*
- config_name: 2020-11
data_files:
- split: train
path: 2020-11/train-*
- config_name: 2020-12
data_files:
- split: train
path: 2020-12/train-*
- config_name: 2021-01
data_files:
- split: train
path: 2021-01/train-*
- config_name: 2021-02
data_files:
- split: train
path: 2021-02/train-*
- config_name: 2021-03
data_files:
- split: train
path: 2021-03/train-*
- config_name: 2021-04
data_files:
- split: train
path: 2021-04/train-*
- config_name: 2021-05
data_files:
- split: train
path: 2021-05/train-*
- config_name: 2021-06
data_files:
- split: train
path: 2021-06/train-*
- config_name: 2021-07
data_files:
- split: train
path: 2021-07/train-*
- config_name: 2021-08
data_files:
- split: train
path: 2021-08/train-*
- config_name: 2021-09
data_files:
- split: train
path: 2021-09/train-*
- config_name: 2021-10
data_files:
- split: train
path: 2021-10/train-*
- config_name: 2021-11
data_files:
- split: train
path: 2021-11/train-*
- config_name: 2021-12
data_files:
- split: train
path: 2021-12/train-*
- config_name: 2022-01
data_files:
- split: train
path: 2022-01/train-*
- config_name: 2022-02
data_files:
- split: train
path: 2022-02/train-*
- config_name: 2022-03
data_files:
- split: train
path: 2022-03/train-*
- config_name: 2022-04
data_files:
- split: train
path: 2022-04/train-*
- config_name: 2022-05
data_files:
- split: train
path: 2022-05/train-*
- config_name: 2022-06
data_files:
- split: train
path: 2022-06/train-*
- config_name: 2022-07
data_files:
- split: train
path: 2022-07/train-*
- config_name: 2022-08
data_files:
- split: train
path: 2022-08/train-*
- config_name: 2022-09
data_files:
- split: train
path: 2022-09/train-*
- config_name: 2022-10
data_files:
- split: train
path: 2022-10/train-*
- config_name: 2022-11
data_files:
- split: train
path: 2022-11/train-*
- config_name: 2022-12
data_files:
- split: train
path: 2022-12/train-*
- config_name: 2023-01
data_files:
- split: train
path: 2023-01/train-*
- config_name: 2023-02
data_files:
- split: train
path: 2023-02/train-*
- config_name: 2023-03
data_files:
- split: train
path: 2023-03/train-*
- config_name: 2023-04
data_files:
- split: train
path: 2023-04/train-*
- config_name: 2023-05
data_files:
- split: train
path: 2023-05/train-*
- config_name: 2023-06
data_files:
- split: train
path: 2023-06/train-*
- config_name: 2023-07
data_files:
- split: train
path: 2023-07/train-*
- config_name: 2023-08
data_files:
- split: train
path: 2023-08/train-*
- config_name: 2023-09
data_files:
- split: train
path: 2023-09/train-*
- config_name: 2023-10
data_files:
- split: train
path: 2023-10/train-*
- config_name: 2023-11
data_files:
- split: train
path: 2023-11/train-*
- config_name: 2023-12
data_files:
- split: train
path: 2023-12/train-*
- config_name: 2024-01
data_files:
- split: train
path: 2024-01/train-*
- config_name: 2024-02
data_files:
- split: train
path: 2024-02/train-*
- config_name: 2024-03
data_files:
- split: train
path: 2024-03/train-*
---
# Dataset Card for "bbc_images_alltime"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/processed_t5_context_len_512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 17763634104.0
num_examples: 6917303
download_size: 6975018960
dataset_size: 17763634104.0
---
# Dataset Card for "processed_t5_context_len_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_posicube__Llama-chat-AY-13B | ---
pretty_name: Evaluation run of posicube/Llama-chat-AY-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [posicube/Llama-chat-AY-13B](https://huggingface.co/posicube/Llama-chat-AY-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama-chat-AY-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T00:19:13.486844](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama-chat-AY-13B/blob/main/results_2023-10-24T00-19-13.486844.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10706795302013423,\n\
\ \"em_stderr\": 0.0031664935381812975,\n \"f1\": 0.21251572986577122,\n\
\ \"f1_stderr\": 0.003428235498166665,\n \"acc\": 0.4402889467457888,\n\
\ \"acc_stderr\": 0.010504223854749877\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10706795302013423,\n \"em_stderr\": 0.0031664935381812975,\n\
\ \"f1\": 0.21251572986577122,\n \"f1_stderr\": 0.003428235498166665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \
\ \"acc_stderr\": 0.00899288849727558\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224174\n\
\ }\n}\n```"
repo_url: https://huggingface.co/posicube/Llama-chat-AY-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T00_19_13.486844
path:
- '**/details_harness|drop|3_2023-10-24T00-19-13.486844.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T00-19-13.486844.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T00_19_13.486844
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-19-13.486844.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-19-13.486844.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T00_19_13.486844
path:
- '**/details_harness|winogrande|5_2023-10-24T00-19-13.486844.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T00-19-13.486844.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- results_2023-10-04T02-16-36.083173.parquet
- split: 2023_10_24T00_19_13.486844
path:
- results_2023-10-24T00-19-13.486844.parquet
- split: latest
path:
- results_2023-10-24T00-19-13.486844.parquet
---
# Dataset Card for Evaluation run of posicube/Llama-chat-AY-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama-chat-AY-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [posicube/Llama-chat-AY-13B](https://huggingface.co/posicube/Llama-chat-AY-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama-chat-AY-13B",
"harness_winogrande_5",
split="train")
```
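As an aside, the per-task config names listed above follow a regular pattern: `harness_`, then the raw task id with `:` and `-` replaced by `_`, then the few-shot count. The helper below is an illustrative sketch inferred from the YAML in this card, not part of any official tooling:

```python
def harness_config(task: str, num_fewshot: int) -> str:
    """Build a config name such as 'harness_arc_challenge_25'.

    The mapping is inferred from the configs above: ':' and '-' in the
    raw task id (e.g. 'arc:challenge', 'hendrycksTest-anatomy') become '_'.
    """
    safe = task.replace(":", "_").replace("-", "_")
    return f"harness_{safe}_{num_fewshot}"

print(harness_config("arc:challenge", 25))         # harness_arc_challenge_25
print(harness_config("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
```

This makes it easy to iterate over tasks programmatically instead of copying config names by hand.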
## Latest results
These are the [latest results from run 2023-10-24T00:19:13.486844](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama-chat-AY-13B/blob/main/results_2023-10-24T00-19-13.486844.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10706795302013423,
"em_stderr": 0.0031664935381812975,
"f1": 0.21251572986577122,
"f1_stderr": 0.003428235498166665,
"acc": 0.4402889467457888,
"acc_stderr": 0.010504223854749877
},
"harness|drop|3": {
"em": 0.10706795302013423,
"em_stderr": 0.0031664935381812975,
"f1": 0.21251572986577122,
"f1_stderr": 0.003428235498166665
},
"harness|gsm8k|5": {
"acc": 0.12130401819560273,
"acc_stderr": 0.00899288849727558
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224174
}
}
```
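If you prefer the metrics as a flat table, the nested structure above is straightforward to flatten. The sketch below copies the values from this run and also checks that the `all` accuracy is the plain mean of the per-task accuracies, which holds for the numbers shown here:

```python
# Values copied verbatim from the run above.
results = {
    "all": {"em": 0.10706795302013423, "f1": 0.21251572986577122,
            "acc": 0.4402889467457888},
    "harness|drop|3": {"em": 0.10706795302013423, "f1": 0.21251572986577122},
    "harness|gsm8k|5": {"acc": 0.12130401819560273},
    "harness|winogrande|5": {"acc": 0.7592738752959748},
}

def flatten(results: dict) -> dict:
    """Turn {task: {metric: value}} into {'task/metric': value}."""
    return {f"{task}/{metric}": value
            for task, metrics in results.items()
            for metric, value in metrics.items()}

flat = flatten(results)

# For this run, 'all'/'acc' is the unweighted mean of the per-task accuracies.
per_task_acc = [flat["harness|gsm8k|5/acc"], flat["harness|winogrande|5/acc"]]
assert abs(sum(per_task_acc) / len(per_task_acc) - flat["all/acc"]) < 1e-9
```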
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rongzhangibm/NaturalQuestionsV2 | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
pretty_name: Natural Questions
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: natural-questions
---
# Dataset Card for Natural Questions
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://ai.google.com/research/NaturalQuestions/dataset](https://ai.google.com/research/NaturalQuestions/dataset)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 42981 MB
- **Size of the generated dataset:** 139706 MB
- **Total amount of disk used:** 182687 MB
### Dataset Summary
The NQ corpus contains questions from real users, and it requires QA systems to
read and comprehend an entire Wikipedia article that may or may not contain the
answer to the question. The inclusion of real user questions, and the
requirement that solutions should read an entire page to find the answer, cause
NQ to be a more realistic and challenging task than prior QA datasets.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 42981 MB
- **Size of the generated dataset:** 139706 MB
- **Total amount of disk used:** 182687 MB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### default
```
"id": datasets.Value("string"),
"document": {
"title": datasets.Value("string"),
"url": datasets.Value("string"),
"html": datasets.Value("string"),
"tokens": datasets.features.Sequence(
{
"token": datasets.Value("string"),
"is_html": datasets.Value("bool"),
"start_byte": datasets.Value("int64"),
"end_byte": datasets.Value("int64"),
}
),
},
"question": {
"text": datasets.Value("string"),
"tokens": datasets.features.Sequence(datasets.Value("string")),
},
"long_answer_candidates": datasets.features.Sequence(
{
"start_token": datasets.Value("int64"),
"end_token": datasets.Value("int64"),
"start_byte": datasets.Value("int64"),
"end_byte": datasets.Value("int64"),
"top_level": datasets.Value("bool"),
}
),
"annotations": datasets.features.Sequence(
{
"id": datasets.Value("string"),
"long_answer": {
"start_token": datasets.Value("int64"),
"end_token": datasets.Value("int64"),
"start_byte": datasets.Value("int64"),
"end_byte": datasets.Value("int64"),
"candidate_index": datasets.Value("int64")
},
"short_answers": datasets.features.Sequence(
{
"start_token": datasets.Value("int64"),
"end_token": datasets.Value("int64"),
"start_byte": datasets.Value("int64"),
"end_byte": datasets.Value("int64"),
"text": datasets.Value("string"),
}
),
"yes_no_answer": datasets.features.ClassLabel(
names=["NO", "YES"]
), # Can also be -1 for NONE.
}
)
```
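Once a record is loaded, the nested schema above is ordinary Python data. The toy record below is illustrative (its values are invented, not drawn from the corpus); it shows how the `is_html` flags can filter the token sequence down to visible text:

```python
# Toy record mirroring the nested feature schema above; all values are made up.
example = {
    "id": "example-0",
    "document": {
        "title": "Example article",
        "url": "https://en.wikipedia.org/wiki/Example",
        "html": "<p>Example text.</p>",
        "tokens": {
            "token": ["<p>", "Example", "text", ".", "</p>"],
            "is_html": [True, False, False, False, True],
            "start_byte": [0, 3, 11, 15, 16],
            "end_byte": [3, 10, 15, 16, 20],
        },
    },
    "question": {"text": "what is this page about",
                 "tokens": ["what", "is", "this", "page", "about"]},
}

# Drop HTML markup tokens to recover the visible document text.
doc_tokens = example["document"]["tokens"]
visible = [tok for tok, is_html in zip(doc_tokens["token"], doc_tokens["is_html"])
           if not is_html]
print(" ".join(visible))  # Example text .
```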
### Data Splits
| name | train | validation |
|---------|-------:|-----------:|
| default | 307373 | 7830 |
| dev | N/A | 7830 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[Creative Commons Attribution-ShareAlike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/).
### Citation Information
```
@article{47761,
title = {Natural Questions: a Benchmark for Question Answering Research},
author = {Tom Kwiatkowski and Jennimaria Palomaki and Olivia Redfield and Michael Collins and Ankur Parikh and Chris Alberti and Danielle Epstein and Illia Polosukhin and Matthew Kelcey and Jacob Devlin and Kenton Lee and Kristina N. Toutanova and Llion Jones and Ming-Wei Chang and Andrew Dai and Jakob Uszkoreit and Quoc Le and Slav Petrov},
year = {2019},
journal = {Transactions of the Association for Computational Linguistics}
}
```
### Contributions
|
ShrinivasSK/hi_en_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: tgt
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 6349061.7
num_examples: 18000
- name: test
num_bytes: 705451.3
num_examples: 2000
download_size: 3779852
dataset_size: 7054513.0
---
# Dataset Card for "hi_en_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
XXCCF/bridge_construction | ---
license: gpl-3.0
language:
- zh
tags:
- civil engineering
size_categories:
- 100K<n<1M
---
This project compiles bridge-construction knowledge into a training dataset, planned to include:
1. Bridge construction and design codes and standards
2. Bridge construction Q&A
3. Bridge construction organization designs
4. Bridge sub-project and special construction plans
5. Bridge construction machinery
6. Major Bridge Bureau (大桥局) enterprise standards
7. Calculation reports for large temporary structures (大临结构)
9. |
roa7n/patched_1000_test_p_40_m1_predictions | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: m1_preds
dtype: float32
splits:
- name: train
num_bytes: 643791182
num_examples: 1663294
download_size: 60859409
dataset_size: 643791182
---
# Dataset Card for "patched_1000_test_p_40_m1_predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khhuang/CHOCOLATE | ---
annotations_creators:
- expert-generated
- found
language_creators:
- expert-generated
- found
language:
- en
license: apache-2.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
paperswithcode_id: chocolate
pretty_name: CHOCOLATE
tags:
- chart
- plot
- chart-to-text
- vistext
- statista
- pew
- chart-understanding
- chart-captioning
- chart-summarization
- document-image
configs:
- config_name: default
data_files:
- split: test
path: chocolate.json
---
# Dataset Card for CHOCOLATE
- [Dataset Description](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#dataset-description)
- [Paper Information](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#paper-information)
- [Citation](https://huggingface.co/datasets/khhuang/CHOCOLATE/blob/main/README.md#citation)
## Dataset Description
**CHOCOLATE** is a benchmark for detecting and correcting factual inconsistency in generated chart captions. It consists of captions produced by six of the most advanced models, which are categorized into three subsets:
- **LVLM**: GPT-4V, Bard (before Gemini)
- **LLM-based Pipeline**: DePlot + GPT-4
- **Fine-tuned Model**: ChartT5, MatCha, UniChart
The charts are from two datasets: VisText and the Pew split of Chart-to-Text. In total, **CHOCOLATE** consists of **1,187 examples**. Each instance in **CHOCOLATE** consists of a caption generated by one of the models and annotations of the factual errors for each caption sentence.
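As a rough sketch of how such per-sentence error annotations might be consumed (the field names below are hypothetical, not the benchmark's actual schema), one can compute a caption-level factual error rate:

```python
# Hypothetical instance: one generated caption split into sentences, each
# annotated with the factual error types found in it (empty list = factual).
instance = {
    "model": "DePlot + GPT-4",
    "sentences": [
        {"text": "Sales rose from 2010 to 2020.", "errors": []},
        {"text": "The peak was in 2015.", "errors": ["value"]},
    ],
}

def error_rate(instance):
    """Fraction of caption sentences annotated with at least one factual error."""
    sentences = instance["sentences"]
    return sum(1 for s in sentences if s["errors"]) / len(sentences)

print(error_rate(instance))  # 0.5
```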
## Paper Information
- Paper: https://arxiv.org/abs/2312.10160
- Code: https://github.com/khuangaf/CHOCOLATE/
- Project: https://khuangaf.github.io/CHOCOLATE
## Citation
If you use the **CHOCOLATE** dataset in your work, please kindly cite the paper using this BibTeX:
```
@misc{huang-etal-2023-do,
title = "Do LVLMs Understand Charts? Analyzing and Correcting Factual Errors in Chart Captioning",
author = "Huang, Kung-Hsiang and
Zhou, Mingyang and
Chan, Hou Pong and
Fung, Yi R. and
Wang, Zhenhailong and
Zhang, Lingyu and
Chang, Shih-Fu and
Ji, Heng",
year={2023},
eprint={2312.10160},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
alexrosen45/dogs | ---
license: apache-2.0
task_categories:
- image-classification
pretty_name: Stanford dogs and other dogs dataset
size_categories:
- 10K<n<100K
--- |
awacke1/NPI-20240107 | ---
license: mit
---
NPI and Identification
🆔 NPI: National Provider Identifier, a unique identification number for covered health care providers.
🧑 EntityTypeCode: Indicates whether the provider is an individual (1) or an organization (2).
🔁 ReplacementNPI: NPI that replaces a previous NPI, if applicable.
💼 EmployerIdentificationNumberEIN: Tax identification number for the provider, if they are an organization.
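Although the card does not state it, a widely documented property of the NPI is that its tenth digit is a Luhn check digit computed over the nine identifier digits prefixed with the constant 80840. A minimal validator sketch:

```python
def valid_npi(npi: str) -> bool:
    """Check a 10-digit NPI via the Luhn algorithm over '80840' + NPI."""
    if len(npi) != 10 or not npi.isdigit():
        return False
    digits = [int(c) for c in "80840" + npi]
    total = 0
    # Double every second digit from the right, reducing two-digit results.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(valid_npi("1234567893"))  # True
```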
Provider Names and Credentials
🏢 ProviderOrganizationNameLegalBusinessName: Legal business name of the provider, if an organization.
👨👩👧 ProviderLastNameLegalName: Last (family) name of the provider, if an individual.
📛 ProviderFirstName: First (given) name of the provider, if an individual.
🌟 ProviderMiddleName: Middle name of the provider, if applicable.
📌 ProviderNamePrefixText: Prefix to the provider's name (e.g., Dr., Mr., Ms.).
🏷️ ProviderNameSuffixText: Suffix to the provider's name (e.g., Jr., Sr., III).
🎓 ProviderCredentialText: Credentials of the provider (e.g., MD, DDS, RN).
Other Provider Information
🏥 ProviderOtherOrganizationName: Other organization name used by the provider.
🔠 ProviderOtherOrganizationNameTypeCode: Type code for the other organization name.
🔄 ProviderOtherLastName: Other last name used by the provider.
➡️ ProviderOtherFirstName: Other first name used by the provider.
🆗 ProviderOtherMiddleName: Other middle name used by the provider.
🔼 ProviderOtherNamePrefixText: Other prefix to the provider's name.
🔽 ProviderOtherNameSuffixText: Other suffix to the provider's name.
📜 ProviderOtherCredentialText: Other credentials used by the provider.
🈯 ProviderOtherLastNameTypeCode: Type code for the other last name used.
Business Mailing Address
📫 ProviderFirstLineBusinessMailingAddress: First line of the provider's business mailing address.
📬 ProviderSecondLineBusinessMailingAddress: Second line of the provider's business mailing address.
🏙️ ProviderBusinessMailingAddressCityName: City name of the provider's business mailing address.
📍 ProviderBusinessMailingAddressStateName: State name of the provider's business mailing address.
📮 ProviderBusinessMailingAddressPostalCode: Postal code of the provider's business mailing address.
🌍 ProviderBusinessMailingAddressCountryCodeIfoutsideUS: Country code if outside the U.S.
📞 ProviderBusinessMailingAddressTelephoneNumber: Telephone number for the business mailing address.
📠 ProviderBusinessMailingAddressFaxNumber: Fax number for the business mailing address.
Business Practice Location Address
🏠 ProviderFirstLineBusinessPracticeLocationAddress: First line of the provider's business practice location address.
🏡 ProviderSecondLineBusinessPracticeLocationAddress: Second line of the provider's business practice location address.
🌆 ProviderBusinessPracticeLocationAddressCityName: City name of the provider's practice location.
🗺️ ProviderBusinessPracticeLocationAddressStateName: State name of the provider's practice location.
🛂 ProviderBusinessPracticeLocationAddressPostalCode: Postal code of the provider's practice location.
🌏 ProviderBusinessPracticeLocationAddressCountryCodeIfoutsideUS: Country code if the practice location is outside the U.S.
📲 ProviderBusinessPracticeLocationAddressTelephoneNumber: Telephone number for the practice location.
🖨️ ProviderBusinessPracticeLocationAddressFaxNumber: Fax number for the practice location.
Dates and Status
📅 ProviderEnumerationDate: The date the provider was first added to the NPI registry.
🔄 LastUpdateDate: The date of the last update to the provider's information.
❌ NPIDeactivationReasonCode: Reason code for NPI deactivation, if applicable.
🔚 NPIDeactivationDate: Date of NPI deactivation, if applicable.
🔙 NPIReactivationDate: Date of NPI reactivation, if applicable.
Provider Details
🚹🚺 ProviderGenderCode: Gender code of the provider (if an individual).
👤 AuthorizedOfficialLastName: Last name of the authorized official.
👤 AuthorizedOfficialFirstName: First name of the authorized official.
👤 AuthorizedOfficialMiddleName: Middle name of the authorized official.
📝 AuthorizedOfficialTitleorPosition: Title or position of the authorized official.
📞 AuthorizedOfficialTelephoneNumber: Telephone number of the authorized official.
Licensing and Taxonomy
(For brevity, the descriptions for Healthcare Provider Taxonomy Codes, Provider License Numbers, and State Codes are grouped together due to their repetitive nature across multiple entries.)
🧬 HealthcareProviderTaxonomyCode: Code indicating the provider's specific type or classification of health care supply.
🔑 ProviderLicenseNumber: License number assigned to the provider.
🗺️ ProviderLicenseNumberStateCode: State code where the provider is licensed.
🔀 HealthcareProviderPrimaryTaxonomySwitch: Indicates if the taxonomy code is the provider's primary code.
Other Identifiers
(Repeated for multiple other identifiers with type codes, states, and issuers.)
🔖 OtherProviderIdentifier: Other identifiers used to identify the provider.
🆔 OtherProviderIdentifierTypeCode: Type code of the other identifier.
🗺️ OtherProviderIdentifierState: State code related to the other identifier.
🏢 OtherProviderIdentifierIssuer: Issuer of the other identifier.
Organizational Details and Certification
❓ IsSoleProprietor: Indicates if the provider is a sole proprietor.
🏢 IsOrganizationSubpart: Indicates if the provider is a subpart of an organization.
🏢 ParentOrganizationLBN: Legal business name of the parent organization.
💼 ParentOrganizationTIN: Tax Identification Number of the parent organization.
📛 AuthorizedOfficialNamePrefixText: Prefix of the authorized official's name.
🏷️ AuthorizedOfficialNameSuffixText: Suffix of the authorized official's name.
🎓 AuthorizedOfficialCredentialText: Credentials of the authorized official.
🧩 HealthcareProviderTaxonomyGroup: Group taxonomy codes indicating shared characteristics.
This comprehensive outline provides a detailed understanding of the data structure, making it easier for educators and students alike to navigate and utilize the information effectively in various learning scenarios.
|
NekoJojo/modified_wider_face_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence: int64
- name: bbox
sequence:
sequence: float64
- name: valid_length
dtype: int64
- name: original_size
sequence: int64
- name: resized_bbox
sequence:
sequence: float64
splits:
- name: train
num_bytes: 3521537938.125
num_examples: 28735
download_size: 3291034354
dataset_size: 3521537938.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mhmd-mstf/ShadingDataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1805592413.0
num_examples: 77
download_size: 1803532519
dataset_size: 1805592413.0
---
# Dataset Card for "ShadingDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/arithmetic_2as_1to1 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 54000
num_examples: 2000
- name: validation
num_bytes: 10800
num_examples: 400
download_size: 5297
dataset_size: 64800
---
# Dataset Card for "arithmetic_2as_1to1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bstraehle/en-to-es-auto-finance | ---
license: apache-2.0
task_categories:
- translation
language:
- en
- es
tags:
- finance
- synthetic
pretty_name: English sentences and Spanish translations in the auto finance domain in random sort order
size_categories:
- 1K<n<10K
---
**What**: English sentences and Spanish translations in the auto finance domain in random sort order.
**Models**:
- meta-llama/Llama-2-70b-chat-hf
- mistralai/Mistral-7B-Instruct-v0.1
**Hyperparameters**: Temperature: 0.7
**System Prompt**: You are an English to Spanish translator with a professional tone.
**User Prompts**: Generate 100 unique English sentences and Spanish translation about <...> in JSON format.
- new car financing
- used car financing
- auto loans
- auto leases
- auto loan originations
- auto lease originations
- auto loan servicing
- auto lease servicing
- auto loan payment options
- auto lease payment options
- electric vehicle loans
- electric vehicle leases
- vehicle insurance
- vehicle damage |
nlp-brin-id/unsup-title-content | ---
license: apache-2.0
---
|
vgoldberg/longform_article_summarization | ---
language:
- en
license: apache-2.0
size_categories:
- 100K<n<1M
task_categories:
- summarization
pretty_name: Long-Form Article Summarization Dataset
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 2243293725
num_examples: 105256
download_size: 880664627
dataset_size: 2243293725
---
**Dataset Name:** Long-Form Article Summarization Dataset
**Description:**
The Long-Form Article Summarization Dataset is meticulously curated for the purpose of fine-tuning Natural Language Processing (NLP) models specifically tailored for summarization tasks. It is a rich collection of long-form articles that have been carefully condensed and summarized. The dataset provides a diverse range of topics and writing styles, making it an invaluable resource for researchers and practitioners working on summarization algorithms and applications.
**Data Sources:**
1. **Billsum:** This dataset includes summaries of U.S. congressional and state bills, providing insights into legislative documents.
2. **Scientific Papers:** A collection of scientific papers covering various disciplines, enabling a deep dive into research-oriented content.
3. **Multi_news:** This dataset incorporates news articles, offering a blend of current events and journalistic writing styles.
4. **CCDV/Pubmed-Summarization:** Focused on biomedical literature, this dataset contains summaries from Pubmed articles, offering specialized content related to the field of medicine and life sciences.
**Data Combination:**
The Long-Form Article Summarization Dataset is an amalgamation of the above-mentioned datasets. By combining these diverse sources, the dataset achieves a comprehensive coverage of topics, styles, and domains. This fusion enhances the dataset's versatility and applicability across a wide array of domains, making it a valuable asset for NLP research and development.
**Data Preprocessing:**
To ensure equal representation of unique domains and to manage the scale of the dataset, large datasets were down-sampled. This meticulous preprocessing step guarantees that each domain is adequately represented, promoting a balanced and unbiased training environment for NLP models.
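A sketch of the kind of domain balancing described above (domain names and sizes are illustrative; the card does not specify the exact procedure):

```python
import random

def downsample_to_smallest(domains, seed=0):
    """Down-sample every domain to the size of the smallest one."""
    rng = random.Random(seed)
    target = min(len(items) for items in domains.values())
    return {name: rng.sample(items, target) for name, items in domains.items()}

# Illustrative per-domain example pools (real sizes would be far larger).
domains = {
    "billsum": list(range(1000)),
    "multi_news": list(range(400)),
    "pubmed": list(range(700)),
}
balanced = downsample_to_smallest(domains)
print({name: len(items) for name, items in balanced.items()})
# {'billsum': 400, 'multi_news': 400, 'pubmed': 400}
```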
**Intended Use:**
This dataset is specifically designed for fine-tuning NLP models focused on summarization tasks. Researchers and developers can utilize this dataset to train and evaluate their algorithms for generating concise and informative summaries from long-form articles. The dataset's diverse origins and careful preprocessing make it an ideal choice for enhancing the summarization capabilities of NLP models.
**Access:**
The Long-Form Article Summarization Dataset is available for research purposes and can be accessed through authorized channels. Researchers and developers interested in using this dataset are encouraged to adhere to ethical guidelines and data usage policies governing the respective sources.
**Citation:**
Researchers and practitioners are expected to cite the original sources of the datasets used in this amalgamation, namely "Billsum," "Scientific Papers," "Multi_news," and "CCDV/Pubmed-Summarization," in addition to acknowledging the creation of the Long-Form Article Summarization Dataset in their publications and research outputs.
This dataset card provides an overview of the Long-Form Article Summarization Dataset, outlining its sources, preprocessing methods, intended use, and access guidelines, ensuring transparent and responsible utilization of the valuable data it encapsulates.
|
wangdayaya/Celeb | ---
license: gpl
---
|
CyberHarem/haruka_amami_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of haruka_amami/天海春香/天海春香 (Azur Lane)
This is the dataset of haruka_amami/天海春香/天海春香 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, green_eyes, ribbon, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 577.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_amami_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 353.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_amami_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1154 | 728.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_amami_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 516.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_amami_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1154 | 1003.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruka_amami_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haruka_amami_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, choker, hair_flower, open_mouth, skirt, solo, thighhighs, :d, looking_at_viewer, microphone, mismatched_legwear |
| 1 | 6 |  |  |  |  |  | 1girl, open_mouth, smile, solo, hair_bow, dress |
| 2 | 8 |  |  |  |  |  | 1girl, one_eye_closed, smile, solo, open_mouth, ;d, skirt, star_(symbol), v |
| 3 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, red_ribbon, solo, white_background, open_mouth, short_sleeves, simple_background, :d, bangs, blue_shirt, plaid_skirt, pleated_skirt, red_bow, school_uniform |
| 4 | 10 |  |  |  |  |  | 1girl, solo, bangs, blush, cleavage, looking_at_viewer, medium_breasts, navel, open_mouth, white_bikini, collarbone, day, outdoors, blue_sky, cloud, ocean, water, :d, cowboy_shot, frilled_bikini, jewelry, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | choker | hair_flower | open_mouth | skirt | solo | thighhighs | :d | looking_at_viewer | microphone | mismatched_legwear | smile | hair_bow | dress | one_eye_closed | ;d | star_(symbol) | v | red_ribbon | white_background | short_sleeves | simple_background | bangs | blue_shirt | plaid_skirt | pleated_skirt | red_bow | school_uniform | cleavage | medium_breasts | navel | white_bikini | collarbone | day | outdoors | blue_sky | cloud | ocean | water | cowboy_shot | frilled_bikini | jewelry | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:--------------|:-------------|:--------|:-------|:-------------|:-----|:--------------------|:-------------|:---------------------|:--------|:-----------|:--------|:-----------------|:-----|:----------------|:----|:-------------|:-------------------|:----------------|:--------------------|:--------|:-------------|:--------------|:----------------|:----------|:-----------------|:-----------|:-----------------|:--------|:---------------|:-------------|:------|:-----------|:-----------|:--------|:--------|:--------|:--------------|:-----------------|:----------|:------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | | X | X | X | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | | X | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | | X | | X | | X | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1e690292 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1338
dataset_size: 186
---
# Dataset Card for "1e690292"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PP04/Sanskrit-Text-Summary | ---
license: unknown
---
|
Falah/chapter10_1_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2850
num_examples: 10
download_size: 4171
dataset_size: 2850
---
# Dataset Card for "chapter10_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_17 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 55875639
num_examples: 5267
download_size: 17314819
dataset_size: 55875639
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_drop_aux_be_gonna | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 16842
num_examples: 75
- name: dev_mismatched
num_bytes: 10806
num_examples: 60
- name: test_matched
num_bytes: 10661
num_examples: 46
- name: test_mismatched
num_bytes: 6325
num_examples: 33
- name: train
num_bytes: 621567
num_examples: 2510
download_size: 355017
dataset_size: 666201
---
# Dataset Card for "MULTI_VALUE_mnli_drop_aux_be_gonna"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yamashiro_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yamashiro/山城/山城 (Azur Lane)
This is the dataset of yamashiro/山城/山城 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `animal_ears, black_hair, short_hair, cat_ears, breasts, red_eyes, animal_ear_fluff, bangs, large_breasts, fang, mask_on_head, mismatched_eyebrows, tail, cat_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 810.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 412.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1303 | 933.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 692.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1303 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yamashiro_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
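The IMG+TXT packages, by contrast, are plain archives of images with tag files. A minimal sketch for pairing images with their tag files after extraction, assuming the common convention that each image has a same-stem `.txt` file (the card does not spell out the archive layout):

```python
import os

IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}

def pair_images_with_tags(filenames):
    """Pair each image filename with its same-stem .txt tag file, if present."""
    stems = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        stems.setdefault(stem, set()).add(ext.lower())
    pairs = []
    for stem, exts in sorted(stems.items()):
        images = exts & IMAGE_EXTS
        if images and '.txt' in exts:
            # keep one image per stem, deterministically
            pairs.append((stem + sorted(images)[0], stem + '.txt'))
    return pairs
```

Applied to a directory listing (`os.listdir(...)`), this yields `(image, tags)` pairs ready for a training pipeline; images without a matching tag file are skipped.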
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, looking_at_viewer, school_swimsuit, jingle_bell, solo, tail_bell, blush, cleavage, mask, name_tag, open_mouth, white_thighhighs, collarbone, black_one-piece_swimsuit, blunt_bangs, covered_navel, :d, sitting |
| 1 | 6 |  |  |  |  |  | 1girl, black_kimono, looking_at_viewer, open_mouth, solo, white_thighhighs, wide_sleeves, :d, blush, sitting, blunt_bangs, cat_mask, long_sleeves, paw_pose, simple_background, white_background, medium_breasts, sideboob |
| 2 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_kimono, open_mouth, sideboob, wide_sleeves, blunt_bangs, upper_body, simple_background, white_background, cat_mask, blush, smile, fox_mask |
| 3 | 5 |  |  |  |  |  | 1girl, black_kimono, blush, looking_at_viewer, sideboob, solo, white_thighhighs, wide_sleeves, blunt_bangs, jingle_bell, open_mouth, short_kimono, long_sleeves, medium_breasts, sitting, cat_mask |
| 4 | 7 |  |  |  |  |  | 1girl, black_kimono, looking_at_viewer, open_mouth, solo, white_panties, white_thighhighs, wide_sleeves, blunt_bangs, jingle_bell, long_sleeves, short_kimono, sideboob, simple_background, :d, blush, cowboy_shot, paw_pose, standing, white_background, cat_mask, medium_breasts |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, fox_mask, hetero, open_mouth, paizuri, penis, solo_focus, black_kimono, blush, cum_on_breasts, looking_at_viewer, smile, facial, nipples, blunt_bangs, censored, pov, simple_background, upper_body, white_background |
| 6 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, open_mouth, penis, pussy, sex, vaginal, nude, solo_focus, mask, spread_legs, thighhighs, bar_censor, heart, looking_at_viewer, lying, sweat |
| 7 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, wide_sleeves, bare_shoulders, detached_sleeves, smile, solo, torn_thighhighs, cat_mask, black_thighhighs, hand_on_own_face, nail_polish, red_nails, sitting, black_kimono, black_panties, fox_mask |
| 8 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, red_dress, solo, hair_flower, bare_shoulders, cleavage, open_mouth, official_alternate_costume, tail_bell, jingle_bell, paw_pose, black_pantyhose |
| 9 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, denim_shorts, off_shoulder, collarbone, long_sleeves, open_mouth, cat_girl, jingle_bell, midriff, official_alternate_costume, short_shorts, tail_bell, black_shirt, blunt_bangs, crop_top, navel, torn_shirt, blush, red_bikini, simple_background, torn_shorts, :d, bikini_under_clothes, blue_shorts, white_background, bare_shoulders, cowboy_shot, skin_fang, thick_eyebrows |
| 10 | 32 |  |  |  |  |  | 1girl, solo, serafuku, looking_at_viewer, pleated_skirt, white_shirt, short_sleeves, black_skirt, black_sailor_collar, red_neckerchief, white_thighhighs, blush, tail_bell, jingle_bell, miniskirt, open_mouth, smile, school_bag, zettai_ryouiki, cat_mask, midriff, navel, simple_background, white_background |
| 11 | 9 |  |  |  |  |  | 1girl, christmas, looking_at_viewer, solo, belt, santa_costume, santa_hat, blush, white_thighhighs, hood, gift_box, hair_ornament, open_mouth, red_skirt, scarf, :d, long_sleeves, medium_breasts |
| 12 | 5 |  |  |  |  |  | 1girl, black_leotard, detached_collar, looking_at_viewer, solo, strapless_leotard, wrist_cuffs, bare_shoulders, black_bowtie, brown_pantyhose, cleavage, rabbit_ears, black_pantyhose, fake_animal_ears, high_heels, rabbit_tail, tray, blush, covered_navel, drinking_glass, fishnet_pantyhose, holding, nontraditional_playboy_bunny, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | school_swimsuit | jingle_bell | solo | tail_bell | blush | cleavage | mask | name_tag | open_mouth | white_thighhighs | collarbone | black_one-piece_swimsuit | blunt_bangs | covered_navel | :d | sitting | black_kimono | wide_sleeves | cat_mask | long_sleeves | paw_pose | simple_background | white_background | medium_breasts | sideboob | upper_body | smile | fox_mask | short_kimono | white_panties | cowboy_shot | standing | 1boy | hetero | paizuri | penis | solo_focus | cum_on_breasts | facial | nipples | censored | pov | navel | pussy | sex | vaginal | nude | spread_legs | thighhighs | bar_censor | heart | lying | sweat | bare_shoulders | detached_sleeves | torn_thighhighs | black_thighhighs | hand_on_own_face | nail_polish | red_nails | black_panties | red_dress | hair_flower | official_alternate_costume | black_pantyhose | denim_shorts | off_shoulder | cat_girl | midriff | short_shorts | black_shirt | crop_top | torn_shirt | red_bikini | torn_shorts | bikini_under_clothes | blue_shorts | skin_fang | thick_eyebrows | serafuku | pleated_skirt | white_shirt | short_sleeves | black_skirt | black_sailor_collar | red_neckerchief | miniskirt | school_bag | zettai_ryouiki | christmas | belt | santa_costume | santa_hat | hood | gift_box | hair_ornament | red_skirt | scarf | black_leotard | detached_collar | strapless_leotard | wrist_cuffs | black_bowtie | brown_pantyhose | rabbit_ears | fake_animal_ears | high_heels | rabbit_tail | tray | drinking_glass | fishnet_pantyhose | holding | nontraditional_playboy_bunny |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:------------------|:--------------|:-------|:------------|:--------|:-----------|:-------|:-----------|:-------------|:-------------------|:-------------|:---------------------------|:--------------|:----------------|:-----|:----------|:---------------|:---------------|:-----------|:---------------|:-----------|:--------------------|:-------------------|:-----------------|:-----------|:-------------|:--------|:-----------|:---------------|:----------------|:--------------|:-----------|:-------|:---------|:----------|:--------|:-------------|:-----------------|:---------|:----------|:-----------|:------|:--------|:--------|:------|:----------|:-------|:--------------|:-------------|:-------------|:--------|:--------|:--------|:-----------------|:-------------------|:------------------|:-------------------|:-------------------|:--------------|:------------|:----------------|:------------|:--------------|:-----------------------------|:------------------|:---------------|:---------------|:-----------|:----------|:---------------|:--------------|:-----------|:-------------|:-------------|:--------------|:-----------------------|:--------------|:------------|:-----------------|:-----------|:----------------|:--------------|:----------------|:--------------|:----------------------|:------------------|:------------|:-------------|:-----------------|:------------|:-------|:----------------|:------------|:-------|:-----------|:----------------|:------------|:--------|:----------------|:------------------|:--------------------|:--------------|:---------------|:------------------|:--------------|:-------------------|:-------------|:--------------|:-------|:-----------------|:--------------------|:----------|:-------------------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | | | X | | X | | | | X | | | | X | | | | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | | X | X | | | X | | | X | X | X | X | X | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | X | X | | X | | | | X | X | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | | | | X | | | | X | | | | X | | | | X | | | | | X | X | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | X | X | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 17 |  |  |  |  |  | X | X | | X | X | X | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | X | | X | X | X | X | X | | | X | | X | | X | | X | | | | | X | | X | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 32 |  |  |  |  |  | X | X | | X | X | X | X | | | | X | X | | | | | | | | | X | | | X | X | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | X | X | | | X | | X | | | | X | X | | | | | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | X | | | X | | X | X | | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Snoopy04/realtoxicity-1k | ---
dataset_info:
features:
- name: filename
dtype: string
- name: begin
dtype: int64
- name: end
dtype: int64
- name: challenging
dtype: bool
- name: prompt
struct:
- name: flirtation
dtype: float64
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: profanity
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexually_explicit
dtype: float64
- name: text
dtype: string
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: continuation
struct:
- name: flirtation
dtype: float64
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: profanity
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexually_explicit
dtype: float64
- name: text
dtype: string
- name: threat
dtype: float64
- name: toxicity
dtype: float64
splits:
- name: train
num_bytes: 335748
num_examples: 1000
download_size: 308983
dataset_size: 335748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
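Since `prompt` and `continuation` are nested structs, each record behaves like a dict of dicts once loaded. A minimal sketch of per-attribute filtering on such records (the field names follow the schema above; the sample records and the threshold are illustrative, not taken from the dataset):

```python
def challenging_toxic_prompts(records, threshold=0.5):
    """Select records flagged as challenging whose prompt toxicity exceeds threshold."""
    selected = []
    for rec in records:
        toxicity = rec['prompt'].get('toxicity')
        # toxicity scores may be missing (null) for some records, so guard against None
        if rec['challenging'] and toxicity is not None and toxicity > threshold:
            selected.append(rec)
    return selected

# Illustrative records mirroring the card's schema (values are made up).
sample = [
    {'challenging': True, 'prompt': {'toxicity': 0.91, 'text': '...'}},
    {'challenging': False, 'prompt': {'toxicity': 0.95, 'text': '...'}},
    {'challenging': True, 'prompt': {'toxicity': 0.10, 'text': '...'}},
]
```

The same function works unchanged on rows yielded by a loaded split, since nested struct features are returned as plain dicts.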
|
Brizape/SETH_0404 | ---
dataset_info:
features:
- name: id
dtype: string
- name: texts
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 2425278
num_examples: 504
- name: test
num_bytes: 582671
num_examples: 126
download_size: 837941
dataset_size: 3007949
---
# Dataset Card for "SETH_0404"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bsbell21/MarketMailAI90 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 74916
num_examples: 90
download_size: 44608
dataset_size: 74916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
polyhedralai/mining_concepts | ---
license: mit
---
|
AdapterOcean/data-standardized_cluster_11_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9622067
num_examples: 7804
download_size: 4176276
dataset_size: 9622067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_11_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shellypeng/violet-evergarden-ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 446708153.158
num_examples: 3823
download_size: 478066266
dataset_size: 446708153.158
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "violet-evergarden-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaushikchan/catalog-sql | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 205065
num_examples: 550
download_size: 28269
dataset_size: 205065
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
euclaise/writingprompts | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: prompt
dtype: string
- name: story
dtype: string
splits:
- name: train
num_bytes: 858816216
num_examples: 272600
- name: test
num_bytes: 47681276
num_examples: 15138
- name: validation
num_bytes: 48904993
num_examples: 15620
download_size: 605049830
dataset_size: 955402485
---
# Dataset Card for "writingprompts"
WritingPrompts dataset, as used in [Hierarchical Neural Story Generation](https://arxiv.org/pdf/1805.04833.pdf). Parsed from [the archive](https://dl.fbaipublicfiles.com/fairseq/data/writingPrompts.tar.gz) |
ayoubelmhamdi/prompts-simplify-articles | ---
license: mit
---
10+3 prompts to fine-tune an LLM to simplify article texts.
|
xNoper/gaofen_patch5000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 1968840564.0
num_examples: 5000
download_size: 1008691684
dataset_size: 1968840564.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
scfengv/TVL_Overall_Layer_topics | ---
task_categories:
- text-classification
language:
- zh
--- |
Tonyhacker/carlosdaniel_voicemakers | ---
license: openrail
---
|
open-llm-leaderboard/details_postbot__gpt2-medium-emailgen | ---
pretty_name: Evaluation run of postbot/gpt2-medium-emailgen
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T16:44:21.952672](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public/blob/main/results_2023-11-19T16-44-21.952672.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24213502321663855,\n\
\ \"acc_stderr\": 0.030210866111969045,\n \"acc_norm\": 0.2431559232771965,\n\
\ \"acc_norm_stderr\": 0.031011858860463776,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.43956041135282153,\n\
\ \"mc2_stderr\": 0.015361204238680572,\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.00023443780464839703,\n \"f1\": 0.02527684563758395,\n\
\ \"f1_stderr\": 0.0009458090371986776\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22184300341296928,\n \"acc_stderr\": 0.012141659068147882,\n\
\ \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.30541724756024696,\n\
\ \"acc_stderr\": 0.00459642622000091,\n \"acc_norm\": 0.3430591515634336,\n\
\ \"acc_norm_stderr\": 0.004737608340163401\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756191,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756191\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217897,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217897\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\"\
: 0.267741935483871,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1625615763546798,\n\
\ \"acc_stderr\": 0.02596030006460558,\n \"acc_norm\": 0.1625615763546798,\n\
\ \"acc_norm_stderr\": 0.02596030006460558\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.02869787397186068,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.02869787397186068\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463206,\n\
\ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463206\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871948,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871948\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882392,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882392\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25688073394495414,\n \"acc_stderr\": 0.018732492928342462,\n \"\
acc_norm\": 0.25688073394495414,\n \"acc_norm_stderr\": 0.018732492928342462\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906942,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906942\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976235,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976235\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\
\ \"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.1794871794871795,\n\
\ \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n\
\ \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.21221864951768488,\n\
\ \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.022535006705942825,\n\
\ \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.022535006705942825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.010956556654417353,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.010956556654417353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.43956041135282153,\n\
\ \"mc2_stderr\": 0.015361204238680572\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.0140519560640769\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \
\ \"em_stderr\": 0.00023443780464839703,\n \"f1\": 0.02527684563758395,\n\
\ \"f1_stderr\": 0.0009458090371986776\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/gpt2-medium-emailgen
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|arc:challenge|25_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|drop|3_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|gsm8k|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hellaswag|10_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T16-44-21.952672.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- '**/details_harness|winogrande|5_2023-11-19T16-44-21.952672.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T16-44-21.952672.parquet'
- config_name: results
data_files:
- split: 2023_11_19T16_44_21.952672
path:
- results_2023-11-19T16-44-21.952672.parquet
- split: latest
path:
- results_2023-11-19T16-44-21.952672.parquet
---
# Dataset Card for Evaluation run of postbot/gpt2-medium-emailgen
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/gpt2-medium-emailgen
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public",
"harness_winogrande_5",
split="train")
```
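To load a specific run rather than the latest results, pass the run's timestamped split name instead of `"train"`. Judging from the configurations listed above, split names appear to be the run timestamp with `-` and `:` replaced by `_`; the helper below (`timestamp_to_split`, a hypothetical convenience function, not part of any library) sketches that conversion:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp to the corresponding split name.

    Split names in this dataset appear to be the run timestamp with
    "-" and ":" replaced by "_" (the "." before the microseconds is kept),
    e.g. "2023-11-19T16:44:21.952672" -> "2023_11_19T16_44_21.952672".
    """
    return ts.replace("-", "_").replace(":", "_")


print(timestamp_to_split("2023-11-19T16:44:21.952672"))
# -> 2023_11_19T16_44_21.952672
```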
## Latest results
These are the [latest results from run 2023-11-19T16:44:21.952672](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt2-medium-emailgen_public/blob/main/results_2023-11-19T16-44-21.952672.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24213502321663855,
"acc_stderr": 0.030210866111969045,
"acc_norm": 0.2431559232771965,
"acc_norm_stderr": 0.031011858860463776,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.43956041135282153,
"mc2_stderr": 0.015361204238680572,
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464839703,
"f1": 0.02527684563758395,
"f1_stderr": 0.0009458090371986776
},
"harness|arc:challenge|25": {
"acc": 0.22184300341296928,
"acc_stderr": 0.012141659068147882,
"acc_norm": 0.2645051194539249,
"acc_norm_stderr": 0.012889272949313364
},
"harness|hellaswag|10": {
"acc": 0.30541724756024696,
"acc_stderr": 0.00459642622000091,
"acc_norm": 0.3430591515634336,
"acc_norm_stderr": 0.004737608340163401
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756191,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756191
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217897,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.02596030006460558,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.02596030006460558
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.02869787397186068,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.02869787397186068
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463206,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463206
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871948,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871948
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25688073394495414,
"acc_stderr": 0.018732492928342462,
"acc_norm": 0.25688073394495414,
"acc_norm_stderr": 0.018732492928342462
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976235,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976235
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.025140935950335442,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.025140935950335442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.022535006705942825,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.022535006705942825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417353,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.43956041135282153,
"mc2_stderr": 0.015361204238680572
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464839703,
"f1": 0.02527684563758395,
"f1_stderr": 0.0009458090371986776
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
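Once a results file like the one above has been read (for example with `json.load` on the `results_*.json` file), the per-task metrics are plain nested dicts and can be filtered directly. A minimal sketch, using a trimmed copy of the dict above (values are taken verbatim from this run, but only a few of the entries are reproduced; the real file has one entry per task):

```python
# Trimmed copy of the results dict shown above; the full file contains
# one entry per evaluated task.
results = {
    "all": {"acc": 0.24213502321663855, "mc2": 0.43956041135282153},
    "harness|arc:challenge|25": {"acc": 0.22184300341296928, "acc_norm": 0.2645051194539249},
    "harness|hellaswag|10": {"acc": 0.30541724756024696, "acc_norm": 0.3430591515634336},
    "harness|winogrande|5": {"acc": 0.5043409629044988},
}

# Collect per-task accuracies, skipping the aggregated "all" entry
# (and any entry that reports other metrics, such as mc1/mc2 or em/f1).
task_acc = {task: metrics["acc"]
            for task, metrics in results.items()
            if task != "all" and "acc" in metrics}

for task, acc in sorted(task_acc.items()):
    print(f"{task}: {acc:.4f}")
```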
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Yhyu13__LMCocktail-Mistral-7B-v1
---
pretty_name: Evaluation run of Yhyu13/LMCocktail-Mistral-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yhyu13/LMCocktail-Mistral-7B-v1](https://huggingface.co/Yhyu13/LMCocktail-Mistral-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yhyu13__LMCocktail-Mistral-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T14:28:28.238573](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__LMCocktail-Mistral-7B-v1/blob/main/results_2023-12-29T14-28-28.238573.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6174576993689161,\n\
\ \"acc_stderr\": 0.03283982884760222,\n \"acc_norm\": 0.6212160745049035,\n\
\ \"acc_norm_stderr\": 0.0334940996283564,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6137157589987131,\n\
\ \"mc2_stderr\": 0.015482351528764331\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192601,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6635132443736308,\n\
\ \"acc_stderr\": 0.004715419139697518,\n \"acc_norm\": 0.8569010157339175,\n\
\ \"acc_norm_stderr\": 0.0034945810763985425\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.029520095697687758,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.029520095697687758\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573702,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573702\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6137157589987131,\n\
\ \"mc2_stderr\": 0.015482351528764331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698336\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4723275208491281,\n \
\ \"acc_stderr\": 0.013751375538801331\n }\n}\n```"
repo_url: https://huggingface.co/Yhyu13/LMCocktail-Mistral-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|arc:challenge|25_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|gsm8k|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hellaswag|10_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T14-28-28.238573.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T14-28-28.238573.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- '**/details_harness|winogrande|5_2023-12-29T14-28-28.238573.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T14-28-28.238573.parquet'
- config_name: results
data_files:
- split: 2023_12_29T14_28_28.238573
path:
- results_2023-12-29T14-28-28.238573.parquet
- split: latest
path:
- results_2023-12-29T14-28-28.238573.parquet
---
# Dataset Card for Evaluation run of Yhyu13/LMCocktail-Mistral-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yhyu13/LMCocktail-Mistral-7B-v1](https://huggingface.co/Yhyu13/LMCocktail-Mistral-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
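Judging from the configs listed above, each timestamped split name appears to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that convention (the helper name is illustrative, not part of this dataset's API):

```python
# Convention inferred from this card's YAML: the split for a run is its
# timestamp with "-" and ":" swapped for "_", e.g.
# 2023-12-29T14:28:28.238573 -> 2023_12_29T14_28_28.238573
def split_name(run_timestamp: str) -> str:
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-12-29T14:28:28.238573"))  # -> 2023_12_29T14_28_28.238573
```

This can be handy for addressing a specific historical run instead of the "latest" split.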
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yhyu13__LMCocktail-Mistral-7B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T14:28:28.238573](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__LMCocktail-Mistral-7B-v1/blob/main/results_2023-12-29T14-28-28.238573.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6174576993689161,
"acc_stderr": 0.03283982884760222,
"acc_norm": 0.6212160745049035,
"acc_norm_stderr": 0.0334940996283564,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6137157589987131,
"mc2_stderr": 0.015482351528764331
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192601,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.6635132443736308,
"acc_stderr": 0.004715419139697518,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.0034945810763985425
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664632,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6137157589987131,
"mc2_stderr": 0.015482351528764331
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698336
},
"harness|gsm8k|5": {
"acc": 0.4723275208491281,
"acc_stderr": 0.013751375538801331
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KenLuo/EMPEC | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: "train.jsonl"
- split: validation
path: "dev.jsonl"
- split: test
path: "test_8k.jsonl"
---
EMPEC (Examinations for Medical PErsonnel in Chinese) collects the past 10 years of multiple-choice questions from the Professional and Technical Examinations for Medical Personnel of the Republic of China.
We collect tests for various medical professionals such as Medical Technologists, Medical Radiation Technologists, Registered Professional Nurses, and Physical Therapists. There are in total 81,761 single-choice questions covering a wide range of subjects, including General Clinical Psychology, Anatomy and Physiology, Fundamentals of Respiratory Care, and Occupational Therapy Techniques.
EMPEC poses a remarkable challenge for AI models and can serve as an effective tool to evaluate the medical knowledge models encode in Chinese. We hope EMPEC can support the exploration and building of large multilingual or Chinese language models, especially in the medical domain.
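The card's config points at plain JSONL files (`train.jsonl`, `dev.jsonl`, `test_8k.jsonl`), one JSON object per line. A minimal sketch of parsing such a split with the standard library, using a hypothetical record schema (`question`/`options`/`answer` — the actual field names may differ):

```python
import json
from io import StringIO

# A hypothetical EMPEC-style record; the real JSONL files may use
# different field names than this sketch assumes.
sample_jsonl = StringIO(
    '{"question": "Which muscle flexes the elbow?", '
    '"options": ["Biceps brachii", "Triceps brachii", "Deltoid", "Soleus"], '
    '"answer": "A"}\n'
)

def read_jsonl(fh):
    """Parse one JSON object per line, skipping blank lines."""
    return [json.loads(line) for line in fh if line.strip()]

records = read_jsonl(sample_jsonl)
print(records[0]["answer"])  # A
```

Loading the hosted splits directly with `datasets.load_dataset("KenLuo/EMPEC")` yields the same records as Hugging Face `Dataset` objects.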
If you find EMPEC useful, please consider citing us.
## Citation
```
@misc{EMPEC,
title={EMPEC, Examinations-for-Medical-PErsonnel-in-Chinese},
author={Zheheng Luo},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/zhehengluoK/Examinations-for-Medical-PErsonnel-in-Chinese}},
}
``` |
redflash/event_scheduling | ---
license: apache-2.0
---
|
ibivibiv/alpaca_tasksource4 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135301515
num_examples: 253970
download_size: 76886774
dataset_size: 135301515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Prajapat/grammer_correction_llama2 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: corrections
sequence: string
- name: text
dtype: string
splits:
- name: validation
num_bytes: 789403
num_examples: 755
download_size: 269534
dataset_size: 789403
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
onkar627/MentalHealth | ---
license: mit
---
|
indiansatoshi/ukpop | ---
license: apache-2.0
---
|
sethapun/imdb_misspelled_20 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33633433
num_examples: 25000
- name: validation
num_bytes: 32850078
num_examples: 25000
download_size: 49040121
dataset_size: 66483511
---
# Dataset Card for "imdb_misspelled_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MedAliFarhat/test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 423726
num_examples: 100
download_size: 239606
dataset_size: 423726
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
collabteza/sys-human_db3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: System Prompt
dtype: string
- name: Human Prompt
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 1092224
num_examples: 1354
download_size: 481074
dataset_size: 1092224
---
# Dataset Card for "sys-human_db3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jmcastelo17/FIFA_dataset | ---
dataset_info:
features:
- name: audio
dtype: binary
- name: text
dtype: string
splits:
- name: train
num_bytes: 328939441
num_examples: 296
download_size: 324971288
dataset_size: 328939441
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-samsum-samsum-89ef9c-1465453967 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
BerMaker/test | ---
license: apache-2.0
task_categories:
- text-classification
tags:
- code
- art
size_categories:
- n<1K
--- |
determined-ai/mbpp_short | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 43506
num_examples: 227
- name: test
num_bytes: 54302
num_examples: 291
- name: validation
num_bytes: 9398
num_examples: 51
download_size: 56077
dataset_size: 107206
---
# Dataset Card for "mbpp_short"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AA051610__VA | ---
pretty_name: Evaluation run of AA051610/VA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/VA](https://huggingface.co/AA051610/VA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__VA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T07:22:26.417131](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__VA/blob/main/results_2023-10-11T07-22-26.417131.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4972405415581996,\n\
\ \"acc_stderr\": 0.03512578000813228,\n \"acc_norm\": 0.5002960487991649,\n\
\ \"acc_norm_stderr\": 0.03512615731416433,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44928868954080875,\n\
\ \"mc2_stderr\": 0.014916546411376396\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3848122866894198,\n \"acc_stderr\": 0.014218371065251105,\n\
\ \"acc_norm\": 0.4138225255972696,\n \"acc_norm_stderr\": 0.014392730009221007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47390957976498704,\n\
\ \"acc_stderr\": 0.004982983592459198,\n \"acc_norm\": 0.6251742680740888,\n\
\ \"acc_norm_stderr\": 0.004830885704380092\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n\
\ \"acc_stderr\": 0.028181739720019416,\n \"acc_norm\": 0.567741935483871,\n\
\ \"acc_norm_stderr\": 0.028181739720019416\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846475,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846475\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6128440366972477,\n \"acc_stderr\": 0.02088423199264345,\n \"\
acc_norm\": 0.6128440366972477,\n \"acc_norm_stderr\": 0.02088423199264345\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.029343114798094462,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.029343114798094462\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.02661335084026174,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.02661335084026174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468628,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468628\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.028196400574197426,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.028196400574197426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.02774431344337654,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.02774431344337654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284073,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284073\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598806,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.02018014484330729,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.02018014484330729\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.03743979825926401,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.03743979825926401\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44928868954080875,\n\
\ \"mc2_stderr\": 0.014916546411376396\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/VA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-22-26.417131.parquet'
- config_name: results
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- results_2023-10-11T07-22-26.417131.parquet
- split: latest
path:
- results_2023-10-11T07-22-26.417131.parquet
---
# Dataset Card for Evaluation run of AA051610/VA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/VA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/VA](https://huggingface.co/AA051610/VA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__VA",
"harness_truthfulqa_mc_0",
split="train")
```
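Each run-timestamp split is named after the run's timestamp with the punctuation adjusted (e.g. split `2023_10_11T07_22_26.417131` for the result file `results_2023-10-11T07-22-26.417131.parquet`). A minimal illustrative helper (not part of the `datasets` library, just mirroring the naming convention visible in the config list above) for converting between the two forms:

```python
def split_to_timestamp(split_name: str) -> str:
    """Convert a split name like '2023_10_11T07_22_26.417131' to the
    timestamp used in result file names ('2023-10-11T07-22-26.417131').
    Illustrative only; assumes the leaderboard's naming convention."""
    date_part, time_part = split_name.split("T", 1)
    return date_part.replace("_", "-") + "T" + time_part.replace("_", "-")

print(split_to_timestamp("2023_10_11T07_22_26.417131"))
# 2023-10-11T07-22-26.417131
```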
## Latest results
These are the [latest results from run 2023-10-11T07:22:26.417131](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__VA/blob/main/results_2023-10-11T07-22-26.417131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4972405415581996,
"acc_stderr": 0.03512578000813228,
"acc_norm": 0.5002960487991649,
"acc_norm_stderr": 0.03512615731416433,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44928868954080875,
"mc2_stderr": 0.014916546411376396
},
"harness|arc:challenge|25": {
"acc": 0.3848122866894198,
"acc_stderr": 0.014218371065251105,
"acc_norm": 0.4138225255972696,
"acc_norm_stderr": 0.014392730009221007
},
"harness|hellaswag|10": {
"acc": 0.47390957976498704,
"acc_stderr": 0.004982983592459198,
"acc_norm": 0.6251742680740888,
"acc_norm_stderr": 0.004830885704380092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019416,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019416
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846475,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846475
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6128440366972477,
"acc_stderr": 0.02088423199264345,
"acc_norm": 0.6128440366972477,
"acc_norm_stderr": 0.02088423199264345
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094462,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094462
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.02661335084026174,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.02661335084026174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468628,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468628
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.028196400574197426,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.028196400574197426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284073,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284073
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598806,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.02018014484330729,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.02018014484330729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.03743979825926401,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.03743979825926401
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44928868954080875,
"mc2_stderr": 0.014916546411376396
}
}
```
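The per-task entries above all follow the `harness|<task>|<n-shot>` naming scheme, so aggregate metrics (such as the MMLU average shown under `"all"`) can be recomputed by filtering on the task prefix. A minimal sketch, using an illustrative excerpt of the dictionary above (the full run contains all 57 `hendrycksTest` subtasks):

```python
import statistics

# Illustrative excerpt of the per-task results shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.3848122866894198},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4444444444444444},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.45394736842105265},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = statistics.mean(mmlu_accs)
print(round(mmlu_avg, 4))
```

The same filtering works on the full results JSON linked above; only the excerpt here is trimmed for brevity.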
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jtatman/CoT_reformatted_preprocessed | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 6012900447.762087
num_examples: 1141963
- name: eval
num_bytes: 131635.1853729518
num_examples: 25
download_size: 648216462
dataset_size: 6013032082.94746
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
amishshah/balanced | ---
dataset_info:
features:
- name: title
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 58351317.12
num_examples: 27000
- name: test
num_bytes: 6483479.68
num_examples: 3000
- name: eval
num_bytes: 6483479.68
num_examples: 3000
download_size: 3311033
dataset_size: 71318276.47999999
---
# Dataset Card for "balanced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrm8488/databricks-dolly-15k-curated-es | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: instruction_original_en
dtype: string
- name: context_original_en
dtype: string
- name: response_original_en
dtype: string
- name: id
dtype: int64
splits:
- name: es
num_bytes: 25902709
num_examples: 15015
download_size: 16490137
dataset_size: 25902709
---
# Dataset Card for "databricks-dolly-15k-curated-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tiiuae__falcon-rw-1b | ---
pretty_name: Evaluation run of tiiuae/falcon-rw-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tiiuae/falcon-rw-1b](https://huggingface.co/tiiuae/falcon-rw-1b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-rw-1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T18:16:05.784566](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-rw-1b/blob/main/results_2023-10-25T18-16-05.784566.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652193675,\n \"f1\": 0.0464429530201344,\n\
\ \"f1_stderr\": 0.001186214815178995,\n \"acc\": 0.31283505657403515,\n\
\ \"acc_stderr\": 0.007820275562329611\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652193675,\n\
\ \"f1\": 0.0464429530201344,\n \"f1_stderr\": 0.001186214815178995\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6203630623520127,\n \"acc_stderr\": 0.013639245403711165\n\
\ }\n}\n```"
repo_url: https://huggingface.co/tiiuae/falcon-rw-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T18_16_05.784566
path:
- '**/details_harness|drop|3_2023-10-25T18-16-05.784566.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T18-16-05.784566.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T18_16_05.784566
path:
- '**/details_harness|gsm8k|5_2023-10-25T18-16-05.784566.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T18-16-05.784566.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-16-44.792936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T16-16-44.792936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T18_16_05.784566
path:
- '**/details_harness|winogrande|5_2023-10-25T18-16-05.784566.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T18-16-05.784566.parquet'
- config_name: results
data_files:
- split: 2023_09_13T16_16_44.792936
path:
- results_2023-09-13T16-16-44.792936.parquet
- split: 2023_10_25T18_16_05.784566
path:
- results_2023-10-25T18-16-05.784566.parquet
- split: latest
path:
- results_2023-10-25T18-16-05.784566.parquet
---
# Dataset Card for Evaluation run of tiiuae/falcon-rw-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-rw-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-rw-1b](https://huggingface.co/tiiuae/falcon-rw-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-rw-1b",
"harness_winogrande_5",
split="train")
```
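As a sketch of how the timestamped splits relate to `latest`: each split name encodes its run time in a zero-padded `YYYY_MM_DDTHH_MM_SS` format, so a plain lexicographic sort is enough to find the newest run (the split names below are the ones appearing in this card's configs):

```python
# Split names encode run timestamps; "latest" always mirrors the most
# recent timestamped split. Because the format is zero-padded, sorting
# the strings sorts the runs chronologically.
splits = ["2023_09_13T16_16_44.792936", "2023_10_25T18_16_05.784566", "latest"]

timestamped = sorted(s for s in splits if s != "latest")
newest = timestamped[-1]  # the run that "latest" points to
```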
## Latest results
These are the [latest results from run 2023-10-25T18:16:05.784566](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-rw-1b/blob/main/results_2023-10-25T18-16-05.784566.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193675,
"f1": 0.0464429530201344,
"f1_stderr": 0.001186214815178995,
"acc": 0.31283505657403515,
"acc_stderr": 0.007820275562329611
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652193675,
"f1": 0.0464429530201344,
"f1_stderr": 0.001186214815178995
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480574
},
"harness|winogrande|5": {
"acc": 0.6203630623520127,
"acc_stderr": 0.013639245403711165
}
}
```
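The aggregated results above can also be worked with as a plain Python dict. A minimal sketch (metric values copied verbatim from the results JSON shown above) that collects the per-task accuracies and checks them against the `"all"` aggregate:

```python
# Aggregated results, copied from the "latest" run's results JSON above.
results = {
    "all": {"em": 0.0010486577181208054, "f1": 0.0464429530201344,
            "acc": 0.31283505657403515},
    "harness|drop|3": {"em": 0.0010486577181208054, "f1": 0.0464429530201344},
    "harness|gsm8k|5": {"acc": 0.00530705079605762},
    "harness|winogrande|5": {"acc": 0.6203630623520127},
}

# Collect accuracy per individual task, skipping the "all" aggregate.
per_task_acc = {
    task: vals["acc"]
    for task, vals in results.items()
    if task != "all" and "acc" in vals
}

# Here the "all" accuracy is the unweighted mean of the two per-task
# accuracies that report an "acc" metric.
best_task = max(per_task_acc, key=per_task_acc.get)
```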
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_qqp_irrealis_be_done | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 569355
num_examples: 2766
- name: test
num_bytes: 5610655
num_examples: 27990
- name: train
num_bytes: 5166930
num_examples: 25186
download_size: 6979796
dataset_size: 11346940
---
# Dataset Card for "MULTI_VALUE_qqp_irrealis_be_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mounikaiiith/Telugu-Hatespeech | ---
license: cc-by-4.0
---
Please cite the following references when using this dataset:
@article{marreddy2022resource,
title={Am I a Resource-Poor Language? Data Sets, Embeddings, Models and Analysis for four different NLP tasks in Telugu Language},
author={Marreddy, Mounika and Oota, Subba Reddy and Vakada, Lakshmi Sireesha and Chinni, Venkata Charan and Mamidi, Radhika},
journal={Transactions on Asian and Low-Resource Language Information Processing},
publisher={ACM New York, NY}
}
@article{marreddy2022multi,
title={Multi-Task Text Classification using Graph Convolutional Networks for Large-Scale Low Resource Language},
author={Marreddy, Mounika and Oota, Subba Reddy and Vakada, Lakshmi Sireesha and Chinni, Venkata Charan and Mamidi, Radhika},
journal={arXiv preprint arXiv:2205.01204},
year={2022}
}
|
lhoestq/multi-configs | ---
dataset_info:
- config_name: bar
features:
- name: a
dtype: string
splits:
- name: train
num_bytes: 35
num_examples: 5
download_size: 0
dataset_size: 35
- config_name: foo
features:
- name: a
dtype: string
splits:
- name: train
num_bytes: 35
num_examples: 5
download_size: 0
dataset_size: 35
configs:
- config_name: bar
data_files: bar/train-*
- config_name: foo
data_files: foo/train-*
---
# Dataset Card for "multi-configs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/superb_sd | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: descript_audio_codec
path: data/descript_audio_codec-*
- split: encodec_hf
path: data/encodec_hf-*
- split: speech_tokenizer
path: data/speech_tokenizer-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: original
num_bytes: 805311663.538
num_examples: 3002
- name: descript_audio_codec
num_bytes: 2219303148.506
num_examples: 3002
- name: encodec_hf
num_bytes: 1207945000.934
num_examples: 3002
- name: speech_tokenizer
num_bytes: 806155333.61
num_examples: 3002
download_size: 5056314536
dataset_size: 5038715146.588
---
# Dataset Card for "superb_sd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |